Hey guys,

I'm working on some semi-large datasets and I'm attempting to upload them 
to CouchDB.

My logic is fairly simple:
* Read the local file as a stream
* Pipe it to JSONStream and pull out each entity
* Listen for 'data' events and save each entity to CouchDB with forceSave: false

What seems to be happening is that each save to CouchDB takes a while, so 
parsed entities pile up faster than they can be written. This slowly eats 
memory until the process falls over.

I can really only see two solutions:
* Combine entities into an array (maybe 10k at a time) and save them in bulk, or
* throttle the number of 'data' events JSONStream can emit.

Does anyone have any reading material that would help? I'm fairly new to 
streams in Node.js, and CouchDB doesn't seem to have a streamable write.

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
