Hi Perrin,

Thanks for your reply.

On Windows XP, it seems to me that Perl never gives memory back to the OS, even after I undef variables or after they go out of scope.

For example, I can see it with this code:

use strict;
print "Step 1\n";
<STDIN>;                # pause here and check the process size in Task Manager
my $x = 'a' x 10**7;    # allocate a ~10 MB string
print "Step 2\n";
<STDIN>;                # the process has grown
$x = undef;             # release the string for Perl to reuse
print "Step 3\n";
<STDIN>;                # the process size is unchanged

which gives:

Step 1 > 1800k
Step 2 > 21412k
Step 3 > 21416k  # no memory given back to the OS here

But then I'm wondering why, in the example I gave below, after $sth->finish, the perl.exe process shrank and used less memory. (That would mean $sth->finish made Perl return memory to the OS.)
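
For reference, the pattern I am talking about is roughly this (connection details and table names are placeholders):

use DBI;

my $dbh = DBI->connect('DBI:mysql:database=test;host=localhost',
                       'user', 'password', { RaiseError => 1 });
my $sth = $dbh->prepare('SELECT id, data FROM some_big_table');
$sth->execute();
while (my $row = $sth->fetchrow_arrayref) {
    # process one row
}
$sth->finish;   # this is where I see perl.exe shrink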

What this means for you is that you should never do things like read
an entire large file into a scalar, since that will permanently
increase the size of that apache process.  There's a lot of advice
about this here:
http://modperlbook.org/html/ch14_02.html

Are you saying this because, for example, if a Perl interpreter uses a 100 MB buffer to read a file, then after the file is read, even though that memory can be reused by the Perl interpreter (so we are not talking about a memory leak here), it will never be given back to other processes/interpreters? (I say interpreters because, on Windows, all the Perl interpreters run within the same Apache child process.)
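
For instance (the file name is only an example):

open my $fh, '<', 'C:/temp/big_file.dat' or die "open: $!";
my $data = do { local $/; <$fh> };   # slurp the whole ~100 MB file into $data
close $fh;
undef $data;   # the buffer is now free for this interpreter to reuse,
               # but is it ever returned to the OS for other processes?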

I have two more questions:

1) When a variable is undef'd or goes out of scope, can I be sure that the memory it used is immediately given back to Perl so that it can be reused for other variables? Or does garbage collection run "sometime later on, but we don't know when"? (Better said: is the garbage collection process synchronous or asynchronous?)
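
For instance, I imagine I could test this with a DESTROY handler:

use strict;

package Tracker;
sub new     { print "allocated\n"; return bless {}, shift }
sub DESTROY { print "destroyed\n" }

package main;
{
    my $obj = Tracker->new();
    print "end of scope coming\n";
}   # if collection is synchronous, DESTROY should fire right here
print "after scope\n";

(If "destroyed" prints before "after scope", collection is synchronous.)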

2) If I have a reference to a big array, like:
$tmp = [1..1000000];

does any of the following:

$tmp = 1;   or   $tmp = undef;   or   $#$tmp = -1;

give the memory back to Perl so that it can be used for other purposes?
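
To make the question concrete, here is my (unverified) understanding:

my $tmp = [1 .. 1_000_000];   # allocate a large anonymous array
# each of the following should drop the contents:
$tmp = undef;     # drops the only reference, so the whole array can be freed
# $tmp = 1;       # same: the old reference is simply overwritten
# $#$tmp = -1;    # empties the array, but $tmp still references an empty array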

I am asking this specifically to find out whether, when working with large amounts of data, I should undef variables where possible before allocating new ones, so that processes don't grow too big.

Thanks,

Lionel.



----- Original Message -----
From: "Perrin Harkins" <[EMAIL PROTECTED]>
To: "Lionel MARTIN" <[EMAIL PROTECTED]>
Cc: <modperl@perl.apache.org>
Sent: Thursday, May 10, 2007 4:17 PM
Subject: Re: After retrieving data from DB, the memory doesn't seem to be freed up


On 5/10/07, Lionel MARTIN <[EMAIL PROTECTED]> wrote:
I suspect that this is coming from data retrieved from my DB not being
freed properly.

That's right.  There are two things you need to understand here.

The first is that Perl doesn't free memory from variables unless you
tell it to, even when they go out of scope.  This is an optimization,
and it works in your favor if you use the memory again.  You can free
the memory from a variable by undef'ing it, but that just frees it up
for Perl to use elsewhere in the current process.  (This is somewhat
operating system specific.  You may see processes shrink on some
systems after freeing memory, but not always.)
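
A minimal sketch of that:

my $buf = 'x' x (10 * 1024 * 1024);   # take ~10 MB
undef $buf;   # the memory is now free for Perl to reuse in this process,
              # but the process size seen by the OS usually stays the same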

What this means for you is that you should never do things like read
an entire large file into a scalar, since that will permanently
increase the size of that apache process.  There's a lot of advice
about this here:
http://modperlbook.org/html/ch14_02.html
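
For example, instead of slurping, read line by line (the path here is just an example):

open my $fh, '<', '/var/log/big.log' or die "open: $!";
while (my $line = <$fh>) {
    # work on one line at a time; peak memory stays small
}
close $fh;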

The second is that the MySQL client library will load the entire
result set into your process unless you tell it not to.  This is done
with the mysql_use_result option, discussed in the MySQL docs and also
here:
http://modperlbook.org/html/ch20_02.html
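
A minimal sketch with DBD::mysql (the table name is an example):

my $sth = $dbh->prepare('SELECT id, payload FROM big_table',
                        { mysql_use_result => 1 });
$sth->execute();
while (my $row = $sth->fetchrow_arrayref) {
    # rows stream from the server one at a time instead of being
    # buffered in this process all at once
}
$sth->finish;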

- Perrin

