I'm looking for suggestions, information, and maybe articles about memory usage issues with mod_perl 1.
On a Russian forum someone posted a review of memory "leaks" (caveats):
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;
    $r->send_http_header('text/plain');
    print "Hello, World!\n";
    print ttt();
    return OK;
}

sub ttt {
    my $huge_text = 'x' x 1000000;            # allocate a ~1MB string
    $huge_text = $huge_text . "abc" . "bcd";  # concatenation copies it
    return 1;
}
1) Restart Apache
2) Start `top`
3) Send a request.
4) The Apache child grows by ~4MB instead of ~1MB...
I know that mod_perl (or maybe Perl itself) doesn't free the memory behind $huge_text because it will most likely be needed on the next call, but the concatenation takes several additional MB on top of that. Why?
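If my understanding is right (an assumption on my part, not something I've verified), the statement $huge_text = $huge_text . ... builds the ~1MB result in a temporary and then copies it into $huge_text, so the peak usage is roughly double. An in-place append should avoid the copy; this is the variant of ttt() I would try:

sub ttt {
    my $huge_text = 'x' x 1000000;  # ~1MB string, as before
    # .= appends to the existing buffer instead of building a
    # ~1MB temporary and copying it over (my assumption about
    # Perl's string handling, not verified)
    $huge_text .= "abc" . "bcd";
    return 1;
}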
The example above uses ~4MB of memory, and Apache doesn't free it until the child is restarted.
To prevent huge memory usage I can add undef($huge_text); just before the return, but undef frees only ~1MB. The other ~3MB "leak" and force me to rely on MaxRequestsPerChild... :(
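For concreteness, here is where I put the undef in the hypothetical ttt() from above:

sub ttt {
    my $huge_text = 'x' x 1000000;
    $huge_text = $huge_text . "abc" . "bcd";
    # undef releases the string buffer back to Perl's allocator,
    # but as far as I can tell not back to the OS
    undef $huge_text;
    return 1;
}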
It's a hypothetical example, but we run a huge system (RT from Best Practical), and without limiting requests per child we end up with Apache locking up.
How do I debug situations like this? I want to find a way to rewrite certain calls so that Perl doesn't allocate the additional memory. I'm looking for approaches to debugging such leaks.
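One approach I'm considering, assuming the GTop module (the Perl binding for libgtop) is available on the box, is to log how much the child grows on each request:

use strict;
use GTop ();
use Apache::Constants qw(OK);

my $gtop = GTop->new;

sub handler {
    my $r = shift;
    my $before = $gtop->proc_mem($$)->size;  # child size before the work

    $r->send_http_header('text/plain');
    print "Hello, World!\n";
    print ttt();                             # ttt() as defined above

    my $after = $gtop->proc_mem($$)->size;   # child size after the work
    warn sprintf "pid %d grew by %d bytes on this request\n",
        $$, $after - $before;
    return OK;
}

If I remember correctly, Devel::Size::total_size() can show how much memory a particular variable really holds, and Apache::SizeLimit can restart only the children that cross a size threshold, which seems gentler than a blanket MaxRequestsPerChild. Has anyone used these to track down this kind of growth?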
Best regards,
Ruslan