Is there no way that you can stream the chunks as you read them as
opposed to buffering it all in memory first?

 

-Chad

 

From: [email protected] [mailto:[email protected]] On Behalf
Of darcy
Sent: Tuesday, November 13, 2012 9:56 PM
To: [email protected]
Subject: Re: [nodejs] GC takes too much time when create large buffers
frequently

 

It cannot be avoided sometimes. In my case, I need to build a new block of
binary data and then send it to the client. Even if I use a C++ addon to
build the data, it is still necessary to return a Buffer for network sending.
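[If the protocol allows it, one hypothetical alternative to darcy's single large Buffer is to write the payload as a series of small Buffers as they are produced. `writeRecords` is a made-up name and `dest` stands for the client socket or any writable stream:]

```javascript
// Length-prefix each record and write it out immediately, so only small,
// short-lived Buffers are allocated (cheap for the GC), instead of
// Buffer.concat()-ing everything into one large Buffer first.
function writeRecords(records, dest) {
  records.forEach(function (rec) {
    const body = Buffer.from(rec, 'utf8'); // new Buffer(rec) on 2012-era Node
    const header = Buffer.alloc(4);
    header.writeUInt32BE(body.length, 0);  // 4-byte big-endian length prefix
    dest.write(header);
    dest.write(body);
  });
  dest.end();
}
```

e.g. `writeRecords(['hello', 'world'], socket)`.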

On Tuesday, November 13, 2012 6:20:43 PM UTC+8, chilts wrote:

On 13 November 2012 23:09, darcy <[email protected]> wrote:
> So, you should be careful with large buffers. 

Not just careful, but try to avoid them anyway. By streaming your info 
to/from disk, to/from requests or to/from external services you'll 
avoid having to use large buffers completely. It's probably a good 
practice to get into, rather than allocating large amounts of memory 
per request. 

Cheers, 
Andy 

-- 
Andrew Chilton 
e: [email protected]
w: http://appsattic.com/ 
t: https://twitter.com/andychilton 

-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines:
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
