I've run into a problem with Node 0.6.2 (running on Ubuntu Linux) that
looks like a buffer overflow. I'm not sure if there's a workaround,
whether it's something that is fixed in a later Node build, or
something daft I'm doing.
Node is connected to a child process that is streaming a lot of text.
The Node process is happily picking it up using a
stdout.on("data", ...) event handler. This is appending the incoming
text to a variable until a terminating sequence of characters is
detected, essentially:
var dataStr = data.toString();
contentStr = contentStr + dataStr;
if (terminatorFound(contentStr)) {
  // process the contentStr string
  contentStr = '';
}
So pretty standard stuff I believe.
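To be concrete, the pattern is roughly the following (the "\nEND\n" terminator and the function shape are just placeholders for illustration, not my actual code):

```javascript
// Sketch of the accumulate-until-terminator pattern described above.
// "\nEND\n" is a placeholder; the real terminator sequence differs.
var TERMINATOR = '\nEND\n';
var contentStr = '';

// Feed one "data" chunk in; returns the completed message once the
// terminator has arrived, or null while still accumulating.
function onChunk(chunk) {
  contentStr += chunk.toString();
  var idx = contentStr.indexOf(TERMINATOR);
  if (idx === -1) return null;
  var message = contentStr.slice(0, idx);
  contentStr = contentStr.slice(idx + TERMINATOR.length);
  return message;
}
```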
However, it turns out that if the child process streams out more than
32k of data, things go berserk: the "data" event starts delivering
what look like chunks of garbage. I've been able to adjust the output
from the child process to confirm this is the trigger - if the child
process writes out less than 32k before sending the terminating
character sequence and stopping, it behaves impeccably. As soon as it
streams out more than 32k, it goes crazy and I have to kill the Node
process.
I can reproduce this behaviour repeatedly.
Any suggestions welcome.
Rob