Hi,

I can see this is a very complex topic, and that I would have to delve into Perl's C source code to really understand what's going on behind the scenes.

So, for now, I'll limit myself to these strategies (and please tell me if I'm wrong):
- I'll try to avoid working with large chunks of data, so that the interpreter's memory footprint doesn't grow too big.
- In my Perl scripts, I'll use lexical variables instead of global variables (see the sketch below), so that I can be sure the memory they use can be reused for later requests (global variables, on the other hand, stick around in memory, because of the way mod_perl operates).
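To illustrate the second point, here is a rough sketch of the kind of handler I have in mind (the package name, data, and routines are made up for illustration):

package My::Handler;    # hypothetical handler, just to show the contrast

use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::RequestIO  ();
use Apache2::Const -compile => qw(OK);

our @cache;    # package global: persists between requests in this child process

sub handler {
    my $r = shift;

    my @rows = compute_rows();   # lexical: goes out of scope when the handler
                                 # returns, so Perl can reuse this memory
    @cache   = compute_rows();   # global: keeps its data (and memory) until
                                 # it is explicitly cleared or overwritten

    $r->content_type('text/plain');
    $r->print(scalar(@rows), " rows\n");
    return Apache2::Const::OK;
}

sub compute_rows { return (1 .. 10_000) }   # stand-in for real work

1;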

I found an interesting topic here:
http://www.nntp.perl.org/group/perl.perl5.porters/2006/03/msg111095.html

Lionel.

----- Original Message ----- From: "Perrin Harkins" <[EMAIL PROTECTED]>
To: "Lionel MARTIN" <[EMAIL PROTECTED]>
Cc: <modperl@perl.apache.org>
Sent: Thursday, May 10, 2007 7:57 PM
Subject: Re: After retrieving data from DB, the memory doesn't seem to be freed up


On 5/10/07, Lionel MARTIN <[EMAIL PROTECTED]> wrote:
On Windows XP, it seems to me that Perl never gives back memory to the OS,
even after I undef variables or after they go out of scope.

That's pretty common.  Perl will be able to use the memory for other
variables though.

But then, I'm wondering why, in the example I gave below, after
$sth->finish I could see that the Perl.exe process had shrunk and was
using less memory (meaning that $sth->finish made Perl return the memory
to the OS)?

Some of this is dependent on your OS and compiler.  I can't tell you
why it sometimes gives memory back to the OS.  I can tell you not to
count on it.

Are you saying this because, for example, if a Perl interpreter uses a
100 MB buffer to read a file, then after the file is read, even though the
memory can be used again by that Perl interpreter (so we are not talking
about a memory leak here), it will never be given back to other
processes/interpreters?

Yes.

1) When a variable is undefined or goes out of scope, can I be sure that the memory it was using is immediately returned to Perl so that it can be used for other variables?

No.  Those are two different things.  If you explicitly undef it, the
memory gets handed back to Perl:

undef $foo;

If it just goes out of scope, the memory stays allocated to that variable.
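To restate the distinction with a tiny example (not from the original message; the sizes are arbitrary):

use strict;
use warnings;

sub with_undef {
    my $buf = 'x' x 10_000_000;   # ~10 MB string
    # ... use $buf ...
    undef $buf;                   # buffer handed back to Perl's allocator now
}

sub without_undef {
    my $buf = 'x' x 10_000_000;
    # ... use $buf ...
}                                 # $buf goes out of scope here, but its memory
                                  # stays attached to that variable, ready for
                                  # the next call to without_undef()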

2) If I have a reference to a big array, like:
 $tmp = [1..1000000];

Does a:
$tmp = 1; or a $tmp = undef; or a $#$tmp = -1;
give the memory back to Perl so that it can use it for other purposes?

$tmp is just a reference, and doesn't take much memory at all.  I'm
not sure how you can clear memory that might be allocated to the
anonymous array, or exactly what perl will do with it when the array
goes out of scope.  You can ask on p5p or perlmonks.org if you're
really interested.

Clearing an array is something like this:
@array = ();
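Applied to the reference case above, clearing through the reference would look like this (my extrapolation of the example, not something stated in the message):

my $tmp = [1 .. 1_000_000];

@$tmp = ();    # empty the anonymous array itself; its elements are freed
               # back to Perl, and $tmp still points at the (now empty) array

undef $tmp;    # or drop the reference; once nothing else refers to the
               # array, the whole structure is released back to Perl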

I am asking this specifically to know whether, when working with large amounts of data, I should undef variables where possible before allocating others,
so that processes don't grow too big.

No, you shouldn't.  That would be a painful way to code.  Instead, you
should structure your program so that it never loads large amounts of
data into memory.
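In practice, with DBI, that usually means fetching and processing one row at a time rather than pulling the whole result set into a Perl structure (a sketch, not from the original message; the DSN, table, and process() routine are placeholders):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=test', 'user', 'password',
                       { RaiseError => 1 });

my $sth = $dbh->prepare('SELECT id, payload FROM big_table');
$sth->execute;

while (my ($id, $payload) = $sth->fetchrow_array) {
    process($id, $payload);   # handle each row, then let it go
}

$sth->finish;
$dbh->disconnect;

sub process { }   # placeholder for real per-row work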

- Perrin

