Justin Wyllie wrote:
...
$file_handle->read($s, $length);  # $s is about 1/2 MB
@data = unpack($format, $s);
## at this point memory usage jumps by 8 MB (measured using GTop->size())
while (@data) {
    push @data2, [ shift @data, shift @data, shift @data ];  # this isn't exact, but it looks like this
}
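
That jump is consistent with unpack() building one Perl scalar per field for the whole buffer in one go (each scalar, plus the arrayrefs, carries a few dozen bytes of overhead), so a 1/2 MB buffer can easily balloon into several MB of Perl structures. A rough sketch of one way to avoid the big flat @data list, unpacking one record at a time - the 'NnC' format, $path and $length here are illustrative, not taken from the original code:

use strict;
use warnings;

my ($path, $length) = ('data.bin', 524_288);      # illustrative values
my $record_format   = 'NnC';                      # hypothetical 3-field record
my $record_len      = length pack $record_format, 0, 0, 0;

open my $fh, '<:raw', $path or die "open $path: $!";
read $fh, my $s, $length;                         # $s is about 1/2 MB, as above

my @data2;
for (my $offset = 0; $offset + $record_len <= length $s; $offset += $record_len) {
    # unpack a single record and store its three fields as one arrayref,
    # so no huge intermediate list of scalars is ever built
    push @data2, [ unpack($record_format, substr($s, $offset, $record_len)) ];
}

How much this helps depends on how big @data2 itself gets; whatever the per-request peak is, that is what the child process keeps, as noted further down the thread.
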
>
> I tried undef'ing @data just before the return as it is no longer used,
> but this only gained me 1/2 MB. I would have expected to get all 8 MB
> back. I don't understand why not.
>
Perl (at least on the OSes that I'm familiar with) doesn't release used
memory back to the OS.
Have a lo
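
To see that effect with the same GTop calls already being used for measurement, something like this (an illustrative standalone sketch, not code from the thread) shows the process size climbing when a big structure is built and barely moving when it is undef'ed:

use strict;
use warnings;
use GTop ();

my $gtop   = GTop->new;
my $before = $gtop->proc_mem($$)->size;

my @data = (1) x 1_000_000;              # grow the process on purpose
my $peak = $gtop->proc_mem($$)->size;

undef @data;                             # freed back to Perl's allocator, not to the OS
my $after = $gtop->proc_mem($$)->size;

printf "before %d KB, peak %d KB, after undef %d KB\n",
       map { int($_ / 1024) } $before, $peak, $after;

The "after" number typically stays close to the peak, which is why undef'ing @data only gained back a fraction of the 8 MB.
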
Hi Clint
Yes, Linux, and this script looks good. We think that part of the problem
is in the modules Apache is loading, so this will be useful.
I also have another couple of questions:
I have found the errant code where our process jumps by 13 MB. One part
does something like this:
$file_ha
On Tue 29 Sep 2009, Clinton Gormley wrote:
> > I'm wondering if anyone can advise me on how I could go about
> > trying to understand where this 90 MB is coming from? Some of it
> > must be the mod_perl and Apache binaries - but how much should they
> > be, and apart from the 6 MB in shared memory
Hi Justin
>
> I'm wondering if anyone can advise me on how I could go about trying
> to understand where this 90 MB is coming from? Some of it must be
> the mod_perl and Apache binaries - but how much should they be, and
> apart from the 6 MB in shared memory for my pre-loaded modules, where i
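
On the pre-loaded modules point: the usual way to get that shared 6 MB (and more) is to load everything heavy in the Apache parent from a startup file, so the children share those pages copy-on-write. A minimal sketch - the path and the module list are only examples, not from the original setup:

# /etc/apache2/startup.pl, pulled in once by the Apache parent via
#   PerlRequire /etc/apache2/startup.pl
# in httpd.conf. Modules loaded here are shared copy-on-write by the children.
use strict;
use warnings;

use DBI ();
use Storable ();
use CGI ();        # whatever big modules every request needs

1;    # a startup file must return a true value

The more of the per-child 90 MB that comes from modules loaded this way, the more of it shows up as shared rather than private memory in each child.
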
On Tue 29 Sep 2009, Andreas Mock wrote:
> After searching for a while without finding an answer to my question,
> I hope you can help me.
>
> We're using mod_perl with ModPerl::Registry to have only a thin
> layer between Apache and our Perl scripts. So far so good.
> Now we want to produce error do
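
For reference, the kind of "thin layer" described above is usually just a Registry block in httpd.conf; a typical mod_perl 2 sketch (the paths are illustrative, and this is not the poster's actual configuration) looks like:

Alias /cgi-bin/ /var/www/cgi-bin/
<Location /cgi-bin/>
    SetHandler perl-script
    PerlResponseHandler ModPerl::Registry
    PerlOptions +ParseHeaders
    Options +ExecCGI
</Location>
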