Actually, it is a bit more complicated. I was wrong in my assumptions. To 
be more blunt, I was wrong, period. :)

I had been testing some things using global variables and saw this 
behavior of RAM not being released. Since then a few people have privately 
shown me otherwise, and Stas posted a pointer to a past mod_perl list 
discussion of this issue.

It is still the case that one needs to be careful with how variables grow 
under mod_perl, but it is not true that the RAM is never freed: memory 
Perl releases is reused internally, even though it is generally not handed 
back to the operating system.
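
For instance (a rough, untested sketch, not from that thread): a big 
lexical that falls out of scope gives its storage back to Perl's internal 
pool, and the next call reuses it, even though that RAM usually isn't 
handed back to the OS:

use strict;
use warnings;

sub handle_request {
    # A large lexical; its storage goes back to Perl's allocator when
    # it falls out of scope and is reused by the next call.
    my @rows = map { 'x' x 1024 } 1 .. 1000;    # roughly 1MB of data
    return scalar @rows;
}

# The process grows on the first call and then stays roughly flat: the
# high-water mark persists, but the RAM is not leaked on every request.
handle_request() for 1 .. 10;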

Here is the URL that Stas posted, which makes some of these 
persistent-server memory issues clearer (perhaps even for a growing Java 
server):

http://forum.swarthmore.edu/epigone/modperl/zarwhegerd
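
To make the tradeoff concrete, here is a rough, untested sketch of the 
"sort the whole recordset in Perl" pattern from my earlier mail quoted 
below (the $dbh connection string and the records table are only 
placeholders):

use strict;
use warnings;
use DBI;

# Placeholder connection; any DBD driver would do here.
my $dbh = DBI->connect('dbi:SQLite:dbname=example.db', '', '',
                       { RaiseError => 1 });

# Pull every row into memory at once...
my $rows = $dbh->selectall_arrayref(
    'SELECT id, name, score FROM records', { Slice => {} });

# ...then sort it with Perl logic the database cannot express.
my @sorted = sort { length($a->{name}) <=> length($b->{name}) } @$rows;

# The httpd child now holds enough RAM for the whole recordset; Perl
# will reuse that RAM for later requests, but the process stays at
# this peak size.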

Later,
     Gunther

At 11:19 AM 4/19/00 +0000, [EMAIL PROTECTED] wrote:
>On Tue, Apr 18, 2000 at 01:24:16PM +0800, Gunther Birznieks wrote:
> > If you aren't careful with your programming, an Apache httpd can always
> > grow pretty quickly, because Perl never releases the RAM it has
> > previously allocated. While it does reference-count garbage collection,
> > that happens within the RAM that was already allocated.
> >
> > Let's say you need to sort a record set returned from a DBI call in an
> > unusual, Perl-specific way. If you do this "in memory", you need an array
> > to hold the entire recordset at once. If you do this, though, you will
> > allocate the RAM for that one request that sorted the array, and the
> > httpd will remain that size forever.
> >
> > Keeping the higher RAM allocation is good for performance if you have the
> > RAM, of course, so this is one of those design tradeoffs. Perl was not
> > really written to be a persistent language, so again, trading persistent
> > memory usage for operational speed seems to make sense.
> >
> > Later,
> >    Gunther
> >
>
>Gunther,
>
>Curiosity leads me to the following question...:
>
>So what you're talking about is, let's say a variable becomes 40k large,
>or bigger.  Since we're talking about a pretty big operation we could
>even be talking in terms of several hundred k, but anyway:
>
>That variable would retain its size throughout the persistence of the
>Perl interpreter, correct?  And that memory would be specific to that
>variable?  Hm.. okay, that's where I was getting messed up.  The
>variable's value is lost after the block ends, but its size is never
>realloc'd down to something more appropriate?  That's an interesting
>problem in and of itself.  So if you were to do something like this:
>
>$i = 20;
>$bigvar = "something that's 40k long";
>somememoryhog($bigvar);
>sub somememoryhog {
>         my $var = shift;
>         somememoryhog($var) if ($i-- >= 0);
>}
>
>It would call somememoryhog 20 times, and each time it would copy the
>value of $bigvar onto the next level down of the recursive stack of
>somememoryhog.  The total memory usage would be 20*40k=800k, and it
>would never reallocate that variable down to a reasonable size?
>That's the behaviour I thought would happen, but I was thinking
>the value would be retained through the stack (clearly my error).
>(Okay, so sue me, it would call somememoryhog more than 20 times; I'm
>just trying to clear up something :->)
>
>Thanks,
>Shane.
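
To follow up on the example above: yes, each recursive call copies the 
40k value into a new lexical, so the peak is roughly 20 x 40k, and the 
interpreter hangs on to that peak. A rough, untested sketch of one way to 
avoid the copies is to pass a reference down the stack instead:

use strict;
use warnings;

my $i = 20;
my $bigvar = 'x' x 40_000;           # stand-in for the 40k value

somememoryhog(\$bigvar);             # pass a reference, not the data

sub somememoryhog {
    my $ref = shift;                 # each frame copies only the small
                                     # reference, not its own 40k
    somememoryhog($ref) if $i-- >= 0;
}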

__________________________________________________
Gunther Birznieks ([EMAIL PROTECTED])
Extropia - The Web Technology Company
http://www.extropia.com/
