Actually, if you implement the per-file work as a correctly written writable stream, then backpressure will be handled for you.
On Wednesday, 8 October 2014 03:49:52 UTC+2, Aseem Kishore wrote:
>
> Hi there,
>
> I have a directory with a very large number of files in them (over 1M). I
> need to process them, so I'm using fs.readdir() on the directory.
>
> The problem is, fs.readdir() returns everything at once, causing my script
> to suddenly consume >1 GB of RAM.
>
> AFAICT, there's no way to stream this list instead of returning it all at
> once. Is there anything equivalent that I can do?
>
> Thanks!
>
> Aseem
