Great! This approach works. Calling done() after the data has been read (but 
before it has been written) is fine in _transform. I initially had trouble 
with _flush, though: there, care needs to be taken not to call the callback 
until after the data has been written, otherwise you get an error from 
writing to a closed pipe. This sample seems to get the job done. 


NetTransform.prototype._transform = function(chunk, encoding, done) {
    var that = this;
    // accumulate chunks until there is enough data to act on
    that.myStuff = that.myStuff ? Buffer.concat([that.myStuff, chunk]) : chunk;
    if (that.myStuff.length > 100000) {
        doStuff(that.myStuff, function(newStuff) {
            that.push(newStuff);
        });
        that.myStuff = null;
    }
    // safe to call done() here: the chunk has been read, even though
    // the transformed data may not have been written yet
    done();
};

NetTransform.prototype._flush = function(callback) {
    var that = this;
    if (that.myStuff) {
        doStuff(that.myStuff, function(newStuff) {
            that.push(newStuff);
            // only call back after the data has been written, otherwise
            // the pipe may be closed before the last write completes
            callback();
        });
    }
    else {
        callback();
    }
};

var doStuff = function(data, callback) {
    process.nextTick(function() {
        callback(data);
    });
};

On Wednesday, March 5, 2014 1:59:36 AM UTC-8, greelgorke wrote:
>
> i'm not sure, but to me it looks like there is a misunderstanding:
>
> a transform is built out of a readable and a writable part. with .push you 
> put chunks into the internal buffer of the readable part, so it can inform 
> consumers and provide them the data. with done() you inform the writable 
> part that the chunk is consumed, and it's ok to consume the next one. both 
> have very little to do with each other. 
>
> so if you have to wait for 3 chunks before you can push something, then 
> it's ok to wait for them:
>
>
> _transform = function(chunk, encoding, done) {
>     var that = this;
>     // first buffer them in your small own buffer
>     that.chunks.push(chunk);
>     // then check if you can proceed with transformation
>     if (that.chunks.length === 3) {
>         myTransform(Buffer.concat(that.chunks), function(data) {
>             that.push(data);
>         });
>         that.chunks = [];
>     }
>     // either you process or wait: at this point you have consumed the
>     // incoming chunk, so call done()
>     done();
> }
>
>
> On Wednesday, March 5, 2014 01:17:37 UTC+1, Jeremy Hubble wrote:
>>
>> Thanks!  It seems to get tricky if there are iterations that may not 
>> necessarily call back.  I tried a workaround by calling done before 
>> pushing data. This works as long as there is more than one chunk, but has 
>> problems if only one chunk is received. 
>>
>> _transform = function(chunk, encoding, done) {
>>     var that = this;
>>     myTransform(chunk, function(data) {
>>         that.push(data);
>>     });
>>     done();
>> }
>>
>> One workaround is to simply use a stack of done functions, and then call 
>> them all once some data is retrieved.  
>> On Tuesday, March 4, 2014 1:55:09 PM UTC-8, Timothy J Fontaine wrote:
>>>
>>> You can push as many times in the _transform stream as you like, 
>>> including 0, and you can set an interval that comes back later and pushes, 
>>> or you can opt to wait until _flush is called and then push.
>>>
>>> If you're implementing a protocol parser and may need multiple chunks 
>>> then you just consume the data, and then not push until you have enough 
>>> data to act upon, and then push.
>>>
>>> In other words it doesn't need to be a 1:1 map of _transform to .push, 
>>> but there does need to be a 1:1 map of _transform to done callbacks
>>>
>>>
>>> On Tue, Mar 4, 2014 at 1:05 PM, Jeremy Hubble 
>>> <[email protected]> wrote:
>>>
>>>> Use case is an external service that may need more than one chunk of 
>>>> data to perform its operation.
>>>>
>>>> An analogy would be in some unix commands.
>>>>
>>>> For example:
>>>> cat file.txt | grep someLongString
>>>>
>>>> In node, we get chunks at a time and need to return the result.
>>>>
>>>> Jeremy
>>>>
>>>>
>>>>
>>>>
>>>> On Tuesday, March 4, 2014 12:33:40 PM UTC-8, mscdex wrote:
>>>>>
>>>>> On Tuesday, March 4, 2014 2:45:11 PM UTC-5, Jeremy Hubble wrote:
>>>>>>
>>>>>> Is there a way to construct a transform pipe such that it can receive 
>>>>>> multiple chunks before calling everything back?
>>>>>>
>>>>>>  
>>>>> What's the use case?
>>>>>

-- 
-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en?hl=en

--- 
You received this message because you are subscribed to the Google Groups 
"nodejs" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
