I have an IaaS server on DigitalOcean with 1 GB of RAM, and it hosts the
backends for two of my mobile applications.
I use bouncy to accept requests on port 80 and route them to the two
apps.
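For context, the routing is just host-based. This is roughly what my bouncy front end looks like; the hostnames and ports here are placeholders, not my real values:

```javascript
// Pick a backend port from the Host header. Hostnames/ports are examples.
function pickPort(host) {
  // Strip an optional ":port" suffix before matching.
  var name = (host || '').split(':')[0];
  if (name === 'xml-api.example.com') return 8001;  // XML search app
  if (name === 'images.example.com') return 8002;   // static image server
  return 8001; // default backend
}

// Wiring it up with bouncy (requires `npm install bouncy`), shown
// commented so the snippet stands alone:
// var bouncy = require('bouncy');
// bouncy(function (req, res, bounce) {
//   bounce(pickPort(req.headers.host));
// }).listen(80);
```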
I recently consolidated the two apps onto this one server; they used to
run fine independently on 512 MB servers.
Both are single-purpose APIs. One searches six local XML files and
outputs one; the other is a static file server serving about 1,200 small
images.
Here is the output of my free command:

                    total       used       free     shared    buffers     cached
Mem:              1017956     939592      78364         24      54256     121336
-/+ buffers/cache:           764000     253956
Swap:             4194300     270660    3923640
Traffic on both apps is only moderate, maybe about 2 req/sec. You can
see that free memory is very low (78364 KB), and used memory has grown
to more than double what it was right after I started the apps.
After a day or even less, the apps fail: one app crashes with an
"Evacuation Allocation failed - process out of memory" error, and
networking on the Ubuntu server goes down. I have to reboot from the
dashboard to get the server up again.
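As a stopgap I have been thinking about capping each app's V8 old-space heap explicitly at startup, so one app can't starve the whole box before crashing. I believe this is the relevant flag, though I'm not sure it fixes the root cause; the paths and sizes below are examples, not my real setup:

```shell
# Cap each app's V8 old-generation heap at 256 MB (values are examples).
node --max-old-space-size=256 xml-api/server.js &
node --max-old-space-size=256 image-server/server.js &
```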
So I started logging process.memoryUsage(), and here is the output:
App 1: { rss: 389242880, heapTotal: 58235904, heapUsed: 32149968 }
App 2: { rss: 89575424, heapTotal: 49980416, heapUsed: 27664696 }
Do you notice a problem? I don't know much about memory debugging; can
you point out anything that looks wrong?
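For reference, this is how I'm sampling the numbers above; the 60-second interval is arbitrary:

```javascript
// Log process.memoryUsage() periodically; this produced the
// rss/heapTotal/heapUsed figures above.
function snapshotMemory() {
  var m = process.memoryUsage();
  return {
    rss: m.rss,             // resident set size, in bytes
    heapTotal: m.heapTotal, // V8 heap allocated
    heapUsed: m.heapUsed    // V8 heap actually in use
  };
}

// unref() so the timer doesn't keep the process alive on its own.
setInterval(function () {
  console.log(snapshotMemory());
}, 60 * 1000).unref();
```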
I also tried node-memwatch, logging its 'stats' and 'leak' events. The
server has been up for a day, and so far no leak has been reported.
Anyway, here are the logged stats:
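This is roughly the wiring I used for node-memwatch; the formatting helper is my own, not part of the library:

```javascript
// memwatch wiring that produced the dumps below. Requires
// `npm install memwatch` -- shown commented so the snippet stands alone.
// var memwatch = require('memwatch');
// memwatch.on('stats', function (stats) { console.log(formatStats(stats)); });
// memwatch.on('leak', function (info) { console.error('LEAK:', info); });

// Helper (mine, not memwatch's) that summarises one stats object
// like the ones printed below.
function formatStats(stats) {
  return 'full_gc=' + stats.num_full_gc +
         ' inc_gc=' + stats.num_inc_gc +
         ' base=' + stats.current_base +
         ' trend=' + stats.usage_trend;
}
```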
*App 1:*
{ num_full_gc: 1,
num_inc_gc: 1,
heap_compactions: 1,
usage_trend: 0,
estimated_base: 9375592,
current_base: 9375592,
min: 0,
max: 0 }
{ num_full_gc: 1,
num_inc_gc: 1,
heap_compactions: 1,
usage_trend: 0,
estimated_base: 9668888,
current_base: 9668888,
min: 0,
max: 0 }
{ num_full_gc: 1,
num_inc_gc: 1,
heap_compactions: 1,
usage_trend: 0,
estimated_base: 10864608,
current_base: 10864608,
min: 0,
max: 0 }
{ num_full_gc: 2,
num_inc_gc: 4,
heap_compactions: 2,
usage_trend: 0,
estimated_base: 15852832,
current_base: 15852832,
min: 0,
max: 0 }
{ num_full_gc: 3,
num_inc_gc: 16,
heap_compactions: 3,
usage_trend: 0,
estimated_base: 16457640,
current_base: 16457640,
min: 16457640,
max: 16457640 }
{ num_full_gc: 4,
num_inc_gc: 33,
heap_compactions: 4,
usage_trend: 0,
estimated_base: 16104624,
current_base: 16104624,
min: 16104624,
max: 16457640 }
*App 2:*
{ num_full_gc: 1,
num_inc_gc: 1,
heap_compactions: 1,
usage_trend: 0,
estimated_base: 7368408,
current_base: 7368408,
min: 0,
max: 0 }
{ num_full_gc: 2,
num_inc_gc: 1,
heap_compactions: 2,
usage_trend: 0,
estimated_base: 7528144,
current_base: 7528144,
min: 0,
max: 0 }
{ num_full_gc: 3,
num_inc_gc: 1,
heap_compactions: 3,
usage_trend: 0,
estimated_base: 7849616,
current_base: 7849616,
min: 7849616,
max: 7849616 }
{ num_full_gc: 4,
num_inc_gc: 1,
heap_compactions: 4,
usage_trend: 0,
estimated_base: 7894376,
current_base: 7894376,
min: 7849616,
max: 7894376 }
{ num_full_gc: 5,
num_inc_gc: 1,
heap_compactions: 5,
usage_trend: 0,
estimated_base: 7628840,
current_base: 7628840,
min: 7628840,
max: 7894376 }
{ num_full_gc: 6,
num_inc_gc: 1,
heap_compactions: 6,
usage_trend: 0,
estimated_base: 7307016,
current_base: 7307016,
min: 7307016,
max: 7894376 }
{ num_full_gc: 7,
num_inc_gc: 1,
heap_compactions: 7,
usage_trend: 0,
estimated_base: 7405248,
current_base: 7405248,
min: 7307016,
max: 7894376 }
{ num_full_gc: 8,
num_inc_gc: 1,
heap_compactions: 8,
usage_trend: 0,
estimated_base: 7441416,
current_base: 7441416,
min: 7307016,
max: 7894376 }
{ num_full_gc: 9,
num_inc_gc: 1,
heap_compactions: 9,
usage_trend: 0,
estimated_base: 7466384,
current_base: 7466384,
min: 7307016,
max: 7894376 }
{ num_full_gc: 10,
num_inc_gc: 1,
heap_compactions: 10,
usage_trend: 0,
estimated_base: 7471194,
current_base: 7514488,
min: 7307016,
max: 7894376 }
{ num_full_gc: 11,
num_inc_gc: 1,
heap_compactions: 11,
usage_trend: 0,
estimated_base: 7472706,
current_base: 7486320,
min: 7307016,
max: 7894376 }
There is more output for App 2, but I think this is enough.
So far, no leak event has fired.
I'm new to analysing memory data, though. Can anyone help me out?
--
You received this message because you are subscribed to the Google Groups
"nodejs" group.
To view this discussion on the web visit
https://groups.google.com/d/msgid/nodejs/3968728e-4dec-4591-b69a-0af2b52427d8%40googlegroups.com.