Hi,

What about the case of a big query result? That is where it seems like 
you can get killed.
I can see my processes grow very large in that case, and that memory 
never comes back to the OS. But I certainly don't want to cap my 
processes, because sometimes I do want a big result from a query.
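
One thing that seems to keep a big result from landing in memory all at 
once is fetching rows one at a time with DBI instead of slurping the 
whole result set. A rough sketch, with made-up table and connect 
details (and note that DBD::mysql buffers the entire result client-side 
unless you set mysql_use_result):

  # Rough sketch only: stream rows instead of slurping the result.
  # Table, columns, and connect string are made-up examples.
  use strict;
  use DBI;

  my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'password',
                         { RaiseError => 1 });

  # mysql_use_result is DBD::mysql-specific; without it the driver
  # still buffers the full result set inside the client library
  my $sth = $dbh->prepare('SELECT id, payload FROM big_table',
                          { mysql_use_result => 1 });
  $sth->execute;

  # only one row lives in perl at any moment
  while (my $row = $sth->fetchrow_arrayref) {
      print "$row->[0]\n";
  }

  $sth->finish;
  $dbh->disconnect;

Of course that only helps when each row can be processed and thrown 
away; it does nothing for the pages perl has already grown.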

Thanks,

Eric

At 02:24 PM 2002-10-08 -0400, you wrote:
>Also, try to find an alternative to loading all that data into memory. You 
>could put it in a dbm file or use Cache::FileCache.  If you really have to 
>have it in memory, load it during startup.pl so that it will be shared 
>between processes.
>
>- Perrin
>
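
The startup.pl preload Perrin describes would look something like the 
sketch below; the package name and loader are made up. Anything built 
before Apache forks is shared between the children by copy-on-write, 
so the memory is paid for once rather than per child:

  # startup.pl -- rough sketch of preloading read-only data before
  # the fork; package name and loader are made-up examples
  package My::SharedData;
  use strict;
  use vars qw(%lookup);

  # runs once in the parent httpd; the children share these pages
  # until something writes to them
  %lookup = build_lookup();

  sub build_lookup {
      # ...load the data from the database here...
      return ( example_key => 'example_value' );
  }

  1;

The catch is that the data has to stay effectively read-only: as soon 
as a child writes to the hash, the touched pages are copied and the 
sharing is lost. Cache::FileCache avoids that by keeping the data out 
of the processes entirely, at the cost of a disk read per lookup.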
>Anthony E. wrote:
>>Look into Apache::Resource or Apache::SizeLimit, which
>>will let you set a maximum size for the Apache
>>process.
>>Both can be added to your startup.pl.
>>--- Plamen Stojanov <[EMAIL PROTECTED]> wrote:
>>
>>>Hi all,
>>>I have a big memory usage problem with perl. I load 2MB of data from a 
>>>database into a perl hash and
>>>perl takes 15MB of memory. As I use this under mod_perl, perl never 
>>>returns this
>>>memory to the OS. I must set a small value for MaxRequestsPerChild in 
>>>order
>>>to restart the perl interpreter so that it doesn't eat a lot of memory. 
>>>Is there any solution to avoid such memory usage?
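
For the archives, Anthony's suggestion is only a couple of lines in 
startup.pl. The thresholds below are arbitrary examples (Apache::SizeLimit 
counts in KB), and the handler still has to be enabled in httpd.conf:

  # startup.pl -- rough sketch; the limits are arbitrary examples
  use Apache::SizeLimit;

  # terminate the child if it grows past ~30MB total, or if its
  # shared portion drops below ~4MB
  $Apache::SizeLimit::MAX_PROCESS_SIZE = 30000;   # KB
  $Apache::SizeLimit::MIN_SHARE_SIZE   = 4000;    # KB

  # and in httpd.conf:
  #   PerlFixupHandler Apache::SizeLimit

Unlike a hard rlimit from Apache::Resource, Apache::SizeLimit lets the 
current request finish and then terminates the child, so an occasional 
big query result still completes; you pay for one bloated process 
briefly instead of keeping it around forever.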
