On Friday, 30 Mar 2012, at 09:07 -0700, Dave Clements wrote:
> Whenever you read from a data event, each chunk is obviously the size of 
> the buffer, which means pieces of data can be split between two chunks, 
> I'm just wondering how other people reconcile data over the chunking chasm 
> (besides obviously collecting it all first then processing it), 
> and is there some specifically awesome way or module I've yet to learn of?

Really depends on the kind of data you have.

For binary network protocols with small messages, you'll probably do
something like first sending a four-byte size field and then a chunk of
data of that size. When you're receiving data, just parse the length
field and grab that many bytes from the stream.
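A minimal sketch of that length-prefix framing, assuming a 4-byte big-endian size field (the `FrameParser` class and its method names are made up for illustration, not a real module):

```javascript
// Toy length-prefix framing: each message on the wire is a 4-byte
// big-endian length field followed by that many payload bytes.
class FrameParser {
  constructor(onMessage) {
    this.buffer = Buffer.alloc(0); // leftover bytes between chunks
    this.onMessage = onMessage;
  }

  // Feed one chunk from a 'data' event; fires onMessage once per
  // complete message, no matter how the chunks were split.
  push(chunk) {
    this.buffer = Buffer.concat([this.buffer, chunk]);
    while (this.buffer.length >= 4) {
      const size = this.buffer.readUInt32BE(0);
      if (this.buffer.length < 4 + size) break; // wait for more data
      this.onMessage(this.buffer.subarray(4, 4 + size));
      this.buffer = this.buffer.subarray(4 + size);
    }
  }
}
```

You'd hook it up with something like `socket.on('data', (chunk) => parser.push(chunk))`.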

For plaintext network protocols with small messages, you might have
something like newline-separated text - in that case, there are plenty
of modules out there that buffer partial lines and emit complete ones.

For large messages (for example, when you download a dump of the entire
Wikipedia as one big XML file), you'll probably want something
SAX-based. With SAX, you get one event per opening tag, one per
attribute, one per text node, and so on. On top of that, you can use
libraries that collect complete subtrees and emit them - that's a bit
easier to handle. The only library I know of that does this is my
https://github.com/thejh/node-halfstreamxml , but there are probably
more.
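To make the SAX idea concrete, here's a toy event-per-tag walker. This is only an illustration of the callback style, not how a real streaming parser works: it assumes the whole document is already in one string, is well-formed, and has no attributes or entities - a real parser also has to cope with tags split across chunk boundaries.

```javascript
// Toy SAX-style walk: fire one callback per tag or text node instead
// of building a full DOM tree in memory.
function saxWalk(xml, handlers) {
  // Matches either a tag (<foo> or </foo>) or a run of text.
  const tokenRe = /<\/?([^>]+)>|([^<]+)/g;
  let m;
  while ((m = tokenRe.exec(xml)) !== null) {
    if (m[1] !== undefined) {
      // Closing tags start with "</".
      if (m[0][1] === '/') handlers.closeTag(m[1]);
      else handlers.openTag(m[1]);
    } else if (m[2].trim() !== '') {
      handlers.text(m[2]);
    }
  }
}
```

For `'<a><b>hi</b></a>'` this fires open(a), open(b), text(hi), close(b), close(a) - the same event stream a real SAX parser would give you, which a subtree-collecting layer can then group into complete nodes.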
