On 2013/6/26 2:42, alessioalex wrote:
@Eldar: I know, but for the sake of brevity I omitted that.
@mscdex: it has little to do with http; I made a simpler example with
a loop here:
https://gist.github.com/alessioalex/5861041 (you can clone this).
Then run node --expose-gc loop.js (the garbage collector will run
every 5 seconds).
Note: I've included a 2 MB sample.txt file.
Also, I've tested on Node 0.8.x and 0.10.x and it behaves the same:
after the gc runs, the memory idles at over 100 MB, which is really odd.
This is the code from the gist:
loop.js
// run with: node --expose-gc loop.js
var fs = require('fs');

// 2 MB file
var fpath = __dirname + '/sample.txt';

function readStream() {
  fs.createReadStream(fpath).on('open', function() {
    process.stdout.write('o');
  }).on('data', function() {
    process.stdout.write('*');
  }).on('end', function() {
    process.stdout.write('$');
  });
}

for (var i = 0; i < 180; i++) {
  readStream();
}

function getMem() {
  var memUsg = (process.memoryUsage().rss / (1024 * 1024)).toFixed(2);
  console.log('[' + new Date() + '] rss: ' + memUsg + ' MB');
}

// gc is only defined when Node is started with --expose-gc
var garbageCollect = (typeof gc === 'function') ? gc : function() {};

setInterval(function() {
  garbageCollect();
  getMem();
}, 5000);
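One way to tell whether this is a V8-level leak or just the allocator holding on to freed pages is to log heapUsed next to rss; a minimal sketch of that idea (the toMb helper is my own formatting shorthand, not part of the gist):

```javascript
// Helper to format byte counts as megabytes.
function toMb(bytes) {
  return (bytes / (1024 * 1024)).toFixed(2);
}

// Log both V8 heap usage and OS-resident memory. If heapUsed drops back
// down after gc while rss stays high, V8 has freed the memory internally
// but the allocator has not returned the pages to the OS.
function logMemory() {
  var mem = process.memoryUsage();
  console.log('rss: ' + toMb(mem.rss) + ' MB, heapUsed: ' + toMb(mem.heapUsed) + ' MB');
}

// unref() lets the process exit even while the timer is pending.
setInterval(logMemory, 5000).unref();
```

If heapUsed stays flat while rss climbs, the numbers from the loop above would point at OS-level memory retention rather than a leak inside V8.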
On Tuesday, June 25, 2013 4:39:56 PM UTC+3, alessioalex wrote:
Hello there,
I'm monitoring my app and I've seen my memory usage increase
really oddly (it did not decrease after X hours), so I suspected a
leak. I've found that the memory seems to remain uncollected
when using fs.createReadStream. I made a small example with an http
server that serves a 2 MB file and ran some ab (Apache Benchmark)
load testing on it (ab -n 5000 -c 100, a couple of times). X
minutes after the load testing is done, the memory idles at 480 MB
and doesn't drop.
Am I missing something, or is there really a problem here?
index.js
var http = require('http'),
    bytes = require('bytes'), // npm install bytes
    fs = require('fs');

var fpath = '/path/to/a/2mb/file/in/my/case';

http.createServer(function(req, res) {
  fs.createReadStream(fpath).pipe(res);
}).listen(7777);

var lastMem;

function getMem(msg) {
  var memUsg;
  msg = msg || '';
  memUsg = bytes(process.memoryUsage().rss);
  if (lastMem !== memUsg) {
    lastMem = memUsg;
    console.log(msg + ' ' + memUsg);
  }
}

setInterval(function() {
  getMem('rss:');
}, 5000);
Thanks!
--
--
Job Board: http://jobs.nodejs.org/
Posting guidelines:
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
---
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To unsubscribe from this group and stop receiving emails from it, send
an email to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.
Interesting. I see about 50 MB of memory reported at the end, and
150 MB when looping 500 times. When I tried 5000, an EMFILE error
popped up.
However, I made heap dumps before creating the streams and after, and
they show almost the same (there are small differences, but my Chrome
can't do a comparison) no matter whether it's 180 times or 500 times.
I don't know why.