> Basically the loop is because the "readable" event doesn't fire until the 
> buffer is filled up and if you want to get data immediately, then you can't 
> rely on "readable"?

The 'readable' event fires as soon as *any* data is added to the
internal buffer, but only if a previous read() call returned null. If
you never got a null read, you haven't exhausted the buffer yet, so
there's no need to emit 'readable': presumably you already know the
stream is readable.
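
So the canonical consumption pattern looks like this (a minimal
sketch; getReadableStreamSomehow() and doSomething() are hypothetical
placeholders):

```javascript
var rs = getReadableStreamSomehow(); // hypothetical source stream

rs.on('readable', function () {
  var chunk;
  // Drain the buffer; the final null return is what re-arms the
  // 'readable' event for the next batch of data.
  while (null !== (chunk = rs.read())) {
    doSomething(chunk); // hypothetical consumer
  }
});
```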

> It would seem (from the docs) that read() without any limit returns the whole 
> buffer, so how would there be more data the next time you call it?

The length of the data returned by read() is implementation-defined.
In objectMode streams it's always exactly one "thing", but in
binary/string streams it can be any amount of data.
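
For example, here's a sketch of the objectMode case, using a hand-fed
stream.Readable just for illustration:

```javascript
var Readable = require('stream').Readable;

var rs = new Readable({ objectMode: true });
rs._read = function () {}; // no-op: we push manually below

rs.push({ id: 1 });
rs.push({ id: 2 });
rs.push(null); // signal end-of-stream

rs.on('readable', function () {
  var obj;
  while (null !== (obj = rs.read())) {
    console.log(obj); // exactly one "thing" per read() call
  }
});
```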

If you're using the stream.Readable base class, then yes, read() with
no size argument will return the full buffer, *unless* you're piping,
in which case it returns only the top chunk in the list, to avoid an
unnecessary copy when more than one chunk is ready.
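
And the binary counterpart, assuming nothing is piping from the
stream:

```javascript
var Readable = require('stream').Readable;

var rs = new Readable();
rs._read = function () {}; // no-op: we push manually below

rs.push('abc');
rs.push('def');

rs.on('readable', function () {
  var buf = rs.read();
  // With no pipe attached, read() returns everything buffered, so
  // this should log 'abcdef' from a single call.
  if (buf !== null) console.log(buf.toString());
});
```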


On Mon, May 6, 2013 at 11:10 AM, James Hartig <[email protected]> wrote:
> Sorry to be late to the party...
>
> On Sunday, April 14, 2013 4:57:49 PM UTC-4, Jorge wrote:
>>
>> On 30/03/2013, at 00:56, Isaac Schlueter wrote:
>>
>> > ```javascript
>> > var chunk;
>> > while (null !== (chunk = rs.read())) {
>> >  doSomething(chunk);
>> > }
>> > ```
>>
>> I used to write code like that too, but it seems it might break. Look:
>>
>> <https://bugs.webkit.org/show_bug.cgi?id=114594>
>>
>> this works:
>>
>> function works (s) {
>>   // ROT13 for letters, ROT5 for digits; one statement per line
>>   var pos;
>>   var n= 0;
>>   var t;
>>   var r= "";
>>   var o= "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ";
>>   var p= "5678901234nopqrstuvwxyzabcdefghijklmNOPQRSTUVWXYZABCDEFGHIJKLM";
>>   while (n < s.length) {
>>     t= s[n];
>>     pos= o.indexOf(t);
>>     r+= (pos >= 0) ? p[pos] : t;
>>     n++;
>>   }
>>   return r;
>> }
>>
>> this doesn't:
>>
>> function fails (s) {
>>   // the same substitution, folded into compound expressions,
>>   // which triggers the WebKit bug linked above
>>   var pos, n = 0,
>>     t, r = "",
>>     o = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ",
>>     p = "5678901234nopqrstuvwxyzabcdefghijklmNOPQRSTUVWXYZABCDEFGHIJKLM";
>>   while (n < s.length) {
>>     r += ((pos = o.indexOf(t = s[n++])) >= 0) ? p[pos] : t;
>>   }
>>   return r;
>> }
>>
>> --
>> ( Jorge )();
>
