On Friday, February 14, 2014 11:47:47 AM UTC-5, Laurent Sertorius wrote:
>
> I use readFileSync method, but i want to read this file in many blocks (1 
> block = 10 000 lines of my bigfile)
>
> fs.readFileSync('./myBigFic').toString().split('\n').forEach(function 
> (line) { 
>
> An idea ?
>

You can use open[1], read[2], and close[3] manually. That way you're not 
blocking and reading everything in at once, but you can pause whenever you 
want and read however much you want at a time.

Another option is to use fs.createReadStream() as Alex suggested and just 
.pause() the stream (and .resume() again once you're ready to process more 
data) if you're using 'data' events, or temporarily stop calling .read() if 
you're consuming the stream that way.

[1] http://nodejs.org/api/fs.html#fs_fs_open_path_flags_mode_callback
[2] 
http://nodejs.org/api/fs.html#fs_fs_read_fd_buffer_offset_length_position_callback
[3] http://nodejs.org/api/fs.html#fs_fs_close_fd_callback

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en

