A little late here, but I use a buffering mechanism for reads, like so: 
https://github.com/cretz/node-tds/blob/master/src/buffer-stream.coffee. I 
kept getting stuck when only half of what I expected was available, so I 
implemented a "transaction" sort of thing that rolls back everything I've 
read since the last commit whenever a read goes out of bounds.
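The linked file is CoffeeScript; as a rough illustration only (not that implementation, and the class and method names here are made up), the "transaction" idea can be sketched in plain Node.js: reads advance a cursor, and if the buffer runs out mid-message, a rollback restores the cursor so the same reads can be retried once more data arrives.

```javascript
// Hypothetical sketch of a transactional read buffer. Reads advance a
// tentative cursor; rollback() undoes them, commit() makes them permanent
// by dropping the consumed bytes.
class TransactionBuffer {
  constructor() {
    this.buffer = Buffer.alloc(0);
    this.cursor = 0; // tentative read position since the last commit
  }

  append(chunk) {
    this.buffer = Buffer.concat([this.buffer, chunk]);
  }

  // Throws if fewer than n bytes remain; the caller should roll back,
  // wait for the next 'data' event, and retry the whole read sequence.
  readBytes(n) {
    if (this.cursor + n > this.buffer.length) {
      throw new RangeError('not enough data buffered');
    }
    const slice = this.buffer.subarray(this.cursor, this.cursor + n);
    this.cursor += n;
    return slice;
  }

  rollback() {
    this.cursor = 0; // back to the last commit point
  }

  commit() {
    this.buffer = this.buffer.subarray(this.cursor); // drop consumed bytes
    this.cursor = 0;
  }
}
```

Wired to a stream, each 'data' event appends the chunk and then attempts a full parse; a RangeError means "incomplete message", so you roll back and simply wait for the next chunk.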

On Friday, March 30, 2012 11:07:41 AM UTC-5, Dave Clements wrote:
>
> Hi all, 
>
> Whenever you read from a data event, each chunk is obviously the size of 
> the buffer, which means pieces of data can be split between two chunks, 
> I'm just wondering how other people reconcile data over the chunking chasm 
> (besides obviously collecting it all first then processing it), 
> and is there some specifically awesome way or module I've yet to learn of?
>
> Best to everyone, 
>
> Dave
>
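For the simple delimiter-framed case the question describes, the usual pattern (besides collecting everything first) is to carry a leftover tail across 'data' events and emit only complete records. A minimal sketch, with a made-up helper name:

```javascript
// Hypothetical line splitter: buffers the partial record at the end of
// each chunk and emits only complete newline-terminated lines.
function makeLineSplitter(onLine) {
  let tail = '';
  return (chunk) => {
    const parts = (tail + chunk.toString('utf8')).split('\n');
    tail = parts.pop(); // last piece may be incomplete; keep it for next time
    for (const line of parts) onLine(line);
  };
}
```

You'd attach the returned function as the stream's 'data' handler; records split across two chunks come out whole because the tail is prepended to the next chunk.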

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en