I was in a similar situation about 2 years ago. Given an incoming PUT 
request with image data, I had to save a CouchDB document with the image 
metadata and add the image itself as an attachment. This was much easier 
to get right if the metadata document could be saved first and the image 
attached in a separate request afterwards. I ended up writing a function 
that went something like this:

function wrapImageStream(readableStream, cb) {
    Whenever readableStream emits "data" or "end":
        Save the event name (and arguments) in a queue
        Try feeding the buffered chunks to a function
        (getImageInfoFromBuffers) that returns the metadata if fed enough
        data to determine it, and null otherwise
            If the metadata was returned:
                Create a new "fake" readable stream that emits all the
                original events + future ones emitted by readableStream
                if it hasn't ended
                Call the callback: cb(null, metadata, fakeReadableStream)
}

This is a pretty nice abstraction that allowed me to pipe the original 
stream anywhere I wanted after I had acquired the metadata.

If I had to do it again, I would implement it as a readable/writable stream 
that guaranteed that a "metadata" event was emitted before the first "data" 
event.

I still have that getImageInfoFromBuffers function lying around somewhere. 
If you need something like that (and someone hasn't open-sourced something 
equivalent in the meantime), let me know, and I'll clean it up and get it 
released.

Best regards,
Andreas Lind Petersen (papandreou)

On Wednesday, December 12, 2012 10:27:25 AM UTC+1, Paul Connolley wrote:
>
> Hi there 
>
> Long time lurker, first time poster. I've been working on a module for the 
> last couple of weeks as a bit of a training exercise. I've been digging 
> into node for the last 6 months and I'm trying to make all my modules 
> streaming. The latest exercise is a reverse image proxy for mapping tiles 
> (OpenLayers) with caching that actually would have some application in 
> frontend JS that I've written. 
>
> I want to store ancillary data externally and then attach it to the stream 
> when piping (akin to how Request and Filed attach relevant header info) and 
> I obviously want to do this asynchronously. My initial thought was to 
> override the pipe method on the stream to load the metadata from file and 
> then call the real pipe once the ancillary data has been retrieved. My 
> concern is whether I would run the risk of losing the streamed image data 
> while waiting for the metadata to load. 
>
> Would it be advisable to keep a buffer of incoming stream data while 
> waiting for the metadata and then emit from the buffer or would it be as 
> simple as pausing and resuming the incoming pipe? At the moment, I'm 
> storing JSON in flat files but I've just been refactoring so that I could 
> plug it in to redis or memcached or whatnot. 
>
> As soon as I get to my desk, I will push my current code up to github in 
> case it's necessary but any advice would definitely be welcomed. 
> Alternatively, if I'm taking the wrong route in solving this problem, any 
> criticism and redirection would be well received. 
>
> Thanks, 
> connrs.

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
