Re: memory usage problem!
Plamen Stojanov <[EMAIL PROTECTED]> writes:

> I load 2Mb data from a database in perl hash and perl takes 15Mb
> memory. As I use this under mod_perl - perl never returns this
> memory to the OS. I must set a little number for MaxRequestsPerChild
> in order to restart perl interpreter not to eat a lot of memory. Is
> there any solution to avoid such memory usage?

Simple: don't load it into memory. You can achieve this in multiple ways:

- (if you really need this as a hash) use a DBM hash, mapping those
  resources to disk instead of memory

- (if you can reconsider) use your database to sort the data according
  to your needs and just process it incrementally (for instance, scan
  record after record; do not keep the data in memory, just print it to
  the output socket)

-- 
( Marcin Kasperski   | A reusable framework that is developed by itself will )
( http://www.mk.w.pl | probably not be very reusable. (Martin)               )
( Dokument biznesowy w LaTeXu: http://www.mk.w.pl/porady/latex/mkofficial_cls )
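The DBM suggestion above can be sketched with core modules only; the file location, key names, and data below are made up for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(O_RDWR O_CREAT);
use SDBM_File;
use File::Temp qw(tempdir);

# A DBM-backed hash: keys and values live on disk, not in the perl
# process's heap, so the process stays small no matter how much you store.
my $dir = tempdir(CLEANUP => 1);    # scratch location for the demo
tie my %big, 'SDBM_File', "$dir/cache", O_RDWR | O_CREAT, 0666
    or die "Cannot tie SDBM file: $!";

# Populate it just like an ordinary hash; writes go to disk.
$big{"record_$_"} = "value_$_" for 1 .. 100;

my $lookup = $big{record_42};       # a single read from disk
my $count  = scalar keys %big;

print "record_42 => $lookup ($count records)\n";
untie %big;
```

Under mod_perl you would tie the hash once (e.g. in startup.pl) against a pre-built file rather than rebuilding it per request; SDBM has small per-value size limits, so GDBM_File or DB_File is the usual choice for real data.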
Re: memory usage problem!
Hi there,

On Tue, 8 Oct 2002, Plamen Stojanov wrote:

> I have a big mem usage problem with perl.
> I load 2Mb data from a database in perl hash and perl takes 15Mb memory.

This isn't a big memory problem unless you only have 32Mb RAM, in which case you're going to run out of memory with mod_perl soon anyway.

> Is there any solution to avoid such memory usage?

Don't load the data into memory that way. There are also ways to share data between processes; check the archives for discussions on the topic - it gets discussed a lot.

73,
Ged.
Re: memory usage problem!
Rodney Broom wrote:

> From: Eric <[EMAIL PROTECTED]>
>
>> What about in the case of a big query result?
>
> I may have come into this thread a bit late, but can't you just
> undefine the storage when you're done with it?
>
>   $data = $sth->fetchall_arrayref;
>   # ... do some stuff;
>   $data = undef;

That will release the memory from that lexical so that perl can use it elsewhere, but it will not release it back to the OS. It's a good idea though, in any situation where you occasionally load large chunks of data into a scalar. (Not a good idea if you load large chunks of data into that scalar on most requests.)

- Perrin
Re: memory usage problem!
From: Eric <[EMAIL PROTECTED]>

> What about in the case of a big query result?

I may have come into this thread a bit late, but can't you just undefine the storage when you're done with it?

  $data = $sth->fetchall_arrayref;
  # ... do some stuff;
  $data = undef;

---
Rodney Broom
President, R.Broom Consulting
http://www.rbroom.com/
Re: memory usage problem!
Eric wrote:

> What about in the case of a big query result? That is where it seems
> like you can get killed.

Riding a bike without a helmet will get you killed; big query results are no problem. All you have to do is write your program so that it pages through the results rather than loading them all into memory at once. The only time this becomes difficult is when you need to load a large BLOB from the database.

- Perrin
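A minimal sketch of the paging approach with DBI, using an in-memory SQLite database as a stand-in for the real one (assumes DBD::SQLite is installed; the table and rows are invented for the demo):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical in-memory database standing in for the real server.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, PrintError => 0 });
$dbh->do('CREATE TABLE items (id INTEGER, name TEXT)');
my $ins = $dbh->prepare('INSERT INTO items VALUES (?, ?)');
$ins->execute($_, "item$_") for 1 .. 3;

# Fetch one row at a time instead of fetchall_arrayref: each record is
# processed and forgotten, so memory stays flat however big the result.
my $sth = $dbh->prepare('SELECT id, name FROM items ORDER BY id');
$sth->execute;
my $rows = 0;
while (my $row = $sth->fetchrow_arrayref) {
    print "$row->[0]\t$row->[1]\n";   # emit immediately, keep nothing
    $rows++;
}
$dbh->disconnect;
```

Note that some drivers (e.g. older DBD::mysql without mysql_use_result) still buffer the whole result set client-side, so check your driver's docs before relying on this.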
Re: memory usage problem!
Hi,

What about in the case of a big query result? That is where it seems like you can get killed. I can see my processes grow very large in that case, and there is no way the memory will come back. But I certainly don't want to limit my processes because I might want to get a big result from a query sometimes.

Thanks,
Eric

At 02:24 PM 2002-10-08 -0400, you wrote:
> Also, try to find an alternative to loading all that data into memory.
> You could put it in a dbm file or use Cache::FileCache. If you really
> have to have it in memory, load it during startup.pl so that it will
> be shared between processes.
>
> - Perrin
>
> Anthony E. wrote:
>> look into Apache::Resource or Apache::SizeLimit which
>> will allow you to set a maximum size for the apache
>> process.
>> both can be added to your startup.pl
>> --- Plamen Stojanov <[EMAIL PROTECTED]> wrote:
>>
>>> Hi all,
>>> I have a big mem usage problem with perl. I load 2Mb data from a
>>> database in perl hash and perl takes 15Mb memory. As I use this
>>> under mod_perl - perl never returns this memory to the OS. I must
>>> set a little number for MaxRequestsPerChild in order to restart
>>> perl interpreter not to eat a lot of memory. Is there any solution
>>> to avoid such memory usage?
Re: memory usage problem!
Also, try to find an alternative to loading all that data into memory. You could put it in a dbm file or use Cache::FileCache. If you really have to have it in memory, load it during startup.pl so that it will be shared between processes.

- Perrin

Anthony E. wrote:

> look into Apache::Resource or Apache::SizeLimit which
> will allow you to set a maximum size for the apache
> process.
>
> both can be added to your startup.pl
>
> --- Plamen Stojanov <[EMAIL PROTECTED]> wrote:
>
>> Hi all,
>> I have a big mem usage problem with perl.
>> I load 2Mb data from a database in perl hash and
>> perl takes 15Mb memory. As I
>> use this under mod_perl - perl never returns this
>> memory to the OS. I must
>> set a little number for MaxRequestsPerChild in order
>> to restart perl
>> interpreter not to eat a lot of memory.
>> Is there any solution to avoid such memory usage?
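The load-during-startup.pl idea relies on copy-on-write: data read into the parent httpd before the fork is shared by all children until someone writes to it. A minimal sketch (package name, loader, and rows are invented; in real code load() would run the database query and be called from startup.pl):

```perl
#!/usr/bin/perl
use strict;
use warnings;

package My::StaticData;

# Read-only lookup table, filled once in the parent process at server
# startup. Forked children inherit the pages copy-on-write, so as long
# as nothing writes to the hash, the data exists once, not per child.
our %LOOKUP;

sub load {
    # Stand-in for the real database query, so the sketch runs standalone.
    %LOOKUP = map { ("key$_" => "value$_") } 1 .. 5;
}

package main;

My::StaticData::load();    # in startup.pl, before Apache forks
print scalar(keys %My::StaticData::LOOKUP), " records preloaded\n";
```

The caveat is that perl can quietly unshare pages (e.g. string/number conversion on a value modifies the scalar), so treat the structure as strictly read-only after startup.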
Re: memory usage problem!
look into Apache::Resource or Apache::SizeLimit, which will allow you to set a maximum size for the apache process.

both can be added to your startup.pl

--- Plamen Stojanov <[EMAIL PROTECTED]> wrote:

> Hi all,
> I have a big mem usage problem with perl.
> I load 2Mb data from a database in perl hash and
> perl takes 15Mb memory. As I
> use this under mod_perl - perl never returns this
> memory to the OS. I must
> set a little number for MaxRequestsPerChild in order
> to restart perl
> interpreter not to eat a lot of memory.
> Is there any solution to avoid such memory usage?
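A sketch of the Apache::SizeLimit setup for mod_perl 1.x (the 30Mb limit is an arbitrary example value, not a recommendation):

```perl
# httpd.conf:
#   PerlFixupHandler Apache::SizeLimit
#
# startup.pl:
use Apache::SizeLimit;

# Kill the child after the current request if its total size
# exceeds this many kilobytes.
$Apache::SizeLimit::MAX_PROCESS_SIZE = 30000;

# Optional: only check the size every N requests to reduce overhead.
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 5;
```

This caps the damage from an occasional huge request without forcing a low MaxRequestsPerChild on every child.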