All you need is a wrapper that gives you an async read function:

stream.read(callback, [len])

Without len, it calls the callback with the next chunk as soon as one 
arrives. With a len argument, it buffers the data until at least len bytes 
have been received, passes the first len bytes to the callback, and keeps 
the rest for the next read. 
It passes null to the callback at the end of the stream.

And if you use streamline or fibers on top of it, parsing becomes really 
natural again because you can put your read call in a simple loop. No need 
to set up complex state machines anymore.

Source on 
https://github.com/Sage/streamlinejs/blob/master/lib/streams/server/streams.js

And the same thing for the writer: a simple async write with a callback 
that takes care of the drain events for you.
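A sketch of that writer side (again illustrative, not the actual streamline code): if stream.write() reports that the kernel buffer is full, the callback is deferred until 'drain' fires, so the caller never has to watch for backpressure itself.

```javascript
// Illustrative async write wrapper that handles backpressure/drain.
// Not the streamline implementation -- just the technique it describes.
function makeWriter(stream) {
  return {
    write(data, callback) {
      if (stream.write(data)) {
        // Buffer below the high-water mark: continue on the next tick.
        process.nextTick(callback);
      } else {
        // Backpressure: resume only once the stream has drained.
        stream.once('drain', callback);
      }
    },
  };
}
```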

Bruno

On Friday, March 30, 2012 6:07:41 PM UTC+2, Dave Clements wrote:
>
> Hi all, 
>
> Whenever you read from a data event, each chunk is obviously the size of 
> the buffer, which means pieces of data can be split between two chunks, 
> I'm just wondering how other people reconcile data over the chunking chasm 
> (besides obviously collecting it all first then processing it), 
> and is there some specifically awesome way or module I've yet to learn of?
>
> Best to everyone, 
>
> Dave
>

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en