Hi,

Apologies for my earlier report not being in the correct format.

Below is a small section of code that I'm having problems with.
Essentially, it accumulates the data into one big buffer; when the
buffer hits 500K, it should stop adding data, print the buffer out,
and decline all further data so that the rest passes through
unmodified.

If I try to download a 650MB file through it, it initially does what
is expected and fills the buffer up to 500K. This can be observed by
looking at /tmp/BigFileTest.result. However, the memory usage of httpd
then continues to grow, eventually to enormous sizes, far exceeding
the amount of data that was downloaded. On top of that, if I cancel
the download, the httpd process appears to carry on running and
doesn't free any memory; it keeps consuming more and more, even though
the client has cancelled the download.

I have also tried removing the buffering code completely, so that all
the filter does is read 1024 bytes and then decline (a stripped-down
sketch follows below), and the same problem occurs.
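
Roughly, the stripped-down version looks like this (the module name
Test::MinimalTest is just a placeholder for my cut-down copy of
BigFileTest; it uses the same mod_perl 2 API as the full module
further down):

#file:Test/MinimalTest.pm
#------------------------

package Test::MinimalTest;

use strict;
use warnings;

use Apache::Filter();

use Apache::Const -compile => qw(DECLINED);

use constant BUFF_LEN => 1024;

sub handler
{
        my($f, $bb) = @_;
        my $scratch = '';

        # Read a single 1024-byte chunk, then hand everything
        # back to Apache unmodified
        $f->read($scratch, BUFF_LEN);

        return Apache::DECLINED;
}
1;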

I am using Apache 2.0.47 and the latest version of mod_perl.

Sorry if this is me being dumb, but I really am stuck here.

========================================================================

httpd.conf Extract

Listen 8081

LoadModule perl_module modules/mod_perl.so
PerlModule Apache2
PerlModule Test::BigFileTest

<VirtualHost *:8081>
        ServerName my_server_name:8081
        ServerAdmin [EMAIL PROTECTED]
        ProxyRequests On
        ProxyRemote * http://corporate_proxy_server:8080
        <Proxy *>
                Order Allow,Deny
                Allow from all
                PerlOutputFilterHandler Test::BigFileTest
        </Proxy>
</VirtualHost>

========================================================================

#file:Test/BigFileTest.pm
#---------------------------
#
# Version: 0.1.001
# Author: Chris Pringle
# Date of last update: 09/12/2003
# Copyright (C) 2003 Hewlett Packard Limited
#

package Test::BigFileTest;

use strict;
use warnings;

use Apache::Filter();
use Apache::RequestRec();
use Apache::Connection();
use APR::Table();

use Apache::Const -compile => qw(OK DECLINED);

use constant BUFF_LEN => 1024;

sub handler 
{
        # Get the filter object
        my($f,$bb) = @_;

        my $c = $f->c;

        # Declare a working buffer and a read scratchpad
        my $buffer  = '';
        my $scratch = '';

        # Only done on the FIRST pass of the filter
        unless($f->ctx)
        {
                $f->r->headers_out->unset('Content-Length');

                # Ensure there is a buffer variable in the filter context
                $f->ctx({buffer => '', done => 0});
        }

        # Once the 500K buffer has been printed, decline every later
        # invocation so the remaining data passes through unmodified
        return Apache::DECLINED if $f->ctx->{done};


        # Get buffer from ctx - will contain buffer from previous buckets
        $buffer = $f->ctx->{buffer};

        # Read as much data as there is available and put it into $buffer
        while($f->read($scratch, BUFF_LEN))
        {
                # If buffer > 500K, print it once, mark that we are
                # done, and decline all further data
                if(length($buffer) > 500000)
                {
                        $f->print($buffer);
                        $f->ctx({buffer => '', done => 1});
                        return Apache::DECLINED;
                }
                else
                {
                        $buffer .= $scratch;
                }
                open(my $fd, '>>', '/tmp/BigFileTest.result')
                        or die "Cannot open /tmp/BigFileTest.result: $!";
                print $fd "Buffer size is " . length($buffer) . "\n";
                close($fd);
        }


        # If this is the last bucket
        if($f->seen_eos)
        {
                #
                # Do some filtering here ....
                #
                #

                $f->print($buffer);
        }
        $f->ctx({buffer => $buffer, done => 0});

        return Apache::OK;
        
} # handler
1;


---
Regards,
Chris Pringle

UK PSG
Hewlett-Packard, Bristol
Tel: +44 117 31 29664


> -----Original Message-----
> From: Stas Bekman [mailto:[EMAIL PROTECTED]
> Sent: 08 December 2003 19:04
> To: Pringle, Chris (HP-PSG)
> Cc: [EMAIL PROTECTED]
> Subject: Re: Memory Leak
> 
> 
> Pringle, Chris (HP-PSG) wrote:
> > Hi All,
> > 
> > Anyone experience problems with filters and large files?
> > 
> > I tried to download a 650MB ISO image through my proxy with
> > a filter
> > enabled and it caused the box to run out of memory, thrash and
> > eventually panic.
> > 
> > It's a stream-based output filter that sits in a loop doing
> > $f->read(...). Even when I do a return Apache::DECLINED after
> > every read, I still get massive memory usage, eventually leading
> > to a crash.
> > 
> > Any ideas?
> 
> Chris, in the future please always report bugs following the
> guidelines at: http://perl.apache.org/bugs/.
> 
> I assume that you are talking about mp2 filters, as you mention
> Apache::DECLINED, which is supposed to pass the data through
> unmodified. How do you send the data through? Does the filter read
> the data from the default Apache handler, or do you feed it with a
> mod_perl response handler?
> 
> __________________________________________________________________
> Stas Bekman            JAm_pH ------> Just Another mod_perl Hacker
> http://stason.org/     mod_perl Guide ---> http://perl.apache.org
> mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
> http://modperlbook.org http://apache.org   http://ticketmaster.com
> 
> 

--
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html
