Solved it by creating an actual Readable stream implementing `_read`.
The trick is to call `push` on each `_read`; when there is no data yet,
delay the pull until data becomes available.

In my stream's `_read` I call `read` on the fs.ReadStream.

As I understand it, I should rely either on `fs.ReadStream.read` and the 'end'
event, or on the 'data' and 'end' events plus `fs.ReadStream.pause/resume`,
but not mix the two.
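
A rough sketch of that approach (illustrative names only, not the gist's exact
code; reopening the file when it grows is only hinted at in comments):

var fs = require('fs');
var util = require('util');
var Readable = require('stream').Readable;

// Reads from a file that is still being written, starting at `start`.
// Ends only after markFinished() has been called and the tail is drained.
function GrowingFileReader(path, start) {
    Readable.call(this);
    this._src = fs.createReadStream(path, {start: start});
    this._finished = false;   // writer is done
    this._waiting = false;    // a _read came in while no data was buffered

    var self = this;
    this._src.on('readable', function () {
        // data arrived while a pull was pending: satisfy it now
        if (self._waiting) {
            self._waiting = false;
            self._read();
        }
    });
    this._src.on('end', function () {
        // the underlying fs.ReadStream hit the current EOF; if the writer
        // is finished we are really done, otherwise a new fs.ReadStream
        // should be opened from the current offset (omitted here)
        if (self._finished) self.push(null);
    });
}
util.inherits(GrowingFileReader, Readable);

GrowingFileReader.prototype._read = function () {
    var chunk = this._src.read();
    if (chunk === null) {
        // nothing buffered yet: delay the pull until 'readable' fires
        this._waiting = true;
    } else {
        this.push(chunk);
    }
};

GrowingFileReader.prototype.markFinished = function () {
    this._finished = true;
};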

Here is a gist: https://gist.github.com/xdenser/8887437

It still needs more work to handle errors, and I also have doubts about the
resume event; maybe it is better to `fs.watch` the file instead of listening
for the 'data' event on the download stream.
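
For example, something like this could drive the delayed pull instead (a
sketch only; fs.watch reliability varies by platform, which is part of the
doubt above):

var fs = require('fs');

// Call onGrow whenever the file changes on disk; returns a function to
// stop watching. In the reader, a pending _read would start a watcher
// and retry the pull when onGrow fires.
function watchForGrowth(path, onGrow) {
    var watcher = fs.watch(path, function (event) {
        if (event === 'change') onGrow();
    });
    return function stop() {
        watcher.close();
    };
}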

2014-02-08 2:12 GMT+02:00 Denys Khanzhyiev <[email protected]>:

> Hello,
>
> I have a task where one slow stream is piped to an fs.WriteStream, and after
> some event I need to read from that written file, i.e. read from a growing
> file starting at some position. I have seen `node-growing-file` and
> `tailing-stream`, but nothing seems to solve my problem.
>
> It looks like I do not understand how streams work.
> Here is my helper object (though it is called PxyStream, it is not in fact a
> stream):
>
> var
>    fs = require('fs');
>
> function PxyStream(path,readStream,writeStream,start,end){
>     this.path = path;
>     this.readStream = readStream;
>     this.writeStream = writeStream;
>     this._offset = start;
>     this.endPos = end;
>     this.writeStream.on('finish',function(){
>         this._writeStreamFinished = true;
>         this.nextStream();
>     }.bind(this))
> }
>
>
> PxyStream.prototype.pipe = function(destination){
>     this.destination = destination;
>     this.nextStream();
> }
>
> PxyStream.prototype.nextStream = function(){
>     if(!this._stream){
>         var options = {
>            start: this._offset
>         };
>         var last = this._writeStreamFinished;
>         console.log('new read stream',this._offset, last);
>         this._stream =  fs.createReadStream(this.path,options);
>         this._stream.pipe(this.destination,{end: false});
>         this._stream.on('data',function(data){
>             this._offset += data.length;
>         }.bind(this));
>
>         this._stream.on('end',function(){
>             console.log('read stream end',this._offset, last);
>            this._stream.unpipe();
>            this._stream = null;
>            if(last) {
>                this.destination.emit('end');
>            }
>            this._watch();
>         }.bind(this));
>     }
> }
>
> PxyStream.prototype._watch = function(){
>    this.readStream.once('data',function(){
>        this.nextStream();
>    }.bind(this))
> }
>
> exports.PxyStream = PxyStream;
>
>
> I am using it like this:
>
> pxyStream = new PxyStream(filePath,<slowReadStream>,
> <fsWriteStream>,start,null);
> // i need end position too but lets skip it for now
> pxyStream.pipe(<otherSlowStream>);
>
> My problem is that I see the 'read stream end' message far before
> otherSlowStream ends. In fact it never ends, but I can see its progress.
> The destination is actually an http response stream.
> I thought stream.pipe should slow down reading in order to keep the buffers
> short. Maybe the attached 'data' handler makes it read fast, but how can I
> count bytes then?
> The other problem is that I cannot end the destination properly.
>
