Hi there, 

Not a perfect solution, but it should work for your case. Depending on your 
system, do something like this: 

var spawn = require('child_process').spawn,
      ls    = spawn('cmd', ['/c', 'dir /B directoryname']);  // on Windows; use ls instead if on *nix

ls.stdout.on('data', function (data) {
  ls.stdout.pause(); // you can skip the pause/resume altogether and use pipe() instead
  //console.log('stdout: ' + data);
  // split the data into lines and process each file
  // when done, call ls.stdout.resume(); here I put it inside a setTimeout
  // just to simulate your async processing of a file
  setTimeout(function () { ls.stdout.resume(); }, 500);
});

You'll also want to listen on stderr, and for events like ls.on('close', ...).

Cheers

On Tuesday, October 7, 2014 6:49:52 PM UTC-7, Aseem Kishore wrote:
>
> Hi there,
>
> I have a directory with a very large number of files in them (over 1M). I 
> need to process them, so I'm using fs.readdir() on the directory.
>
> The problem is, fs.readdir() returns everything at once, causing my script 
> to suddenly consume >1 GB of RAM.
>
> AFAICT, there's no way to stream this list instead of returning it all at 
> once. Is there anything equivalent that I can do?
>
> Thanks!
>
> Aseem
>
>
