Hello,
Thanks for the suggestion, but it doesn't seem to make any difference.

I tried setting:
ProxyIOBufferSize 32768
ProxyReceiveBufferSize 32768

in my httpd.conf, and Apache is still calling my handler several times
per request...

I put in:
warn "Size: " . length($buffer) . "\n";

inside my while ($filter->read) loop, and for a single ~11k page I get
the following:

Size: 1101
Size: 3109
Size: 987
Size: 4096
Size: 1697

(Before I increased the buffer size passed to read(), it would break the
larger of the chunks above down even further.)
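
For reference, the loop in question looks roughly like this (a trimmed
sketch, not my exact code; the module names are what I see in the 1.99
docs, the 4096 read size is a placeholder, and the actual link rewriting
is elided):

use Apache::Filter ();
use Apache::Const -compile => qw(OK);

sub handler {
    my $filter = shift;
    while ($filter->read(my $buffer, 4096)) {
        warn "Size: " . length($buffer) . "\n";
        # ... feed $buffer to HTML::Parser and print the rewritten
        # result; this is where a tag split across chunks breaks ...
        $filter->print($buffer);
    }
    return Apache::OK;
}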

I think the best approach would be to somehow determine where the actual 
end of the document is, and call $p->eof; at that point. Even if 
increasing the various buffers worked, I don't want to make them insanely 
large, and I could still end up with pages larger than the buffer, which 
would leave me with the same problem. I'd rather not rely on looking for 
</html>, since I need this to work for .css and other non-HTML files as 
well. Also, some of the proxied documents use SSI and may contain 
multiple instances of </HTML>. (I tested checking for </html> and then 
calling $p->eof;, and it does solve the problem, but as explained above 
it's not an ideal solution.)
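
In case it helps, here is the direction I'm thinking of trying. It's
only a sketch: I'm assuming $filter->ctx and $filter->seen_eos exist and
behave this way in 1.99 (I haven't verified that against my build), and
the real link-rewriting handlers are elided. The idea is to create one
HTML::Parser per request, keep it across filter invocations in the
filter context, and only call $p->eof; once the filter has seen the end
of the stream. HTML::Parser buffers an incomplete tag between parse()
calls, so tags split across chunks should stop breaking things:

use Apache::Filter ();
use Apache::Const -compile => qw(OK);
use HTML::Parser ();

sub handler {
    my $filter = shift;

    my $ctx = $filter->ctx;
    unless ($ctx) {
        # First invocation for this request: build the parser once
        # and keep it (plus an output buffer) for later invocations.
        my $out = '';
        my $p = HTML::Parser->new(
            # Pass everything through untouched; the real start_h
            # handler that rewrites links would go here.
            default_h => [ sub { $out .= $_[0] }, 'text' ],
        );
        $ctx = { parser => $p, out => \$out };
        $filter->ctx($ctx);
    }

    my $p   = $ctx->{parser};
    my $out = $ctx->{out};

    while ($filter->read(my $buffer, 4096)) {
        # HTML::Parser holds back an incomplete tag until the next
        # parse() call, so split tags are handled for us.
        $p->parse($buffer);
    }

    # Only flush the parser once the whole response has been seen.
    $p->eof if $filter->seen_eos;

    if (length $$out) {
        $filter->print($$out);
        $$out = '';
    }

    return Apache::OK;
}

If seen_eos turns out not to exist in 1.99, I'm back to needing some
other way to detect the end of the response.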

At 11:34 PM 5/6/2002 +0200, pascal barbedor wrote:
 > hi
 >
 > you could maybe set the ProxyIOBufferSize
 > or ProxyReceiveBufferSize
 > in the front-end server so that the response from the mod_perl server
 > would not be chunked but sent in one shot
 >
 > also static resources like gifs in server B documents could be retrieved
 > from server A with just an alias, not proxied to server B
 >
 >
 > pascal
 >
 >
 > ----- Original Message -----
 > From: "Douglas Younger" <[EMAIL PROTECTED]>
 > To: <[EMAIL PROTECTED]>
 > Sent: Monday, May 06, 2002 10:26 PM
 > Subject: How do I determine end of request? (mod_perl 2.0)
 >
 >
 > > Hello,
 > >    I'm fairly new to using mod_perl. I've been able to find lots of
 > > resources dealing with mod_perl 1.x, but the documentation for 2.0 is
 > > rather sparse.
 > >
 > > I'm pretty sure what I need to do can only be handled by Apache 2.0 &
 > > thus I'm forced to use mod_perl 2.0... (well 1.99)
 > >
 > > I'm trying to proxy ServerB through ServerA... ok that's simple enough
 > > with mod_proxy. However, links, embedded images, etc. in the proxied
 > > document end up broken if they are non-relative links (i.e. they start
 > > with a slash).
 > >
 > > Example: on ServerB there is a document, say /sales/products.html,
 > > and in products.html it links to /images/logo.gif.
 > > Accessing /sales/products.html directly on ServerB, everything is
 > > fine. But if I want to proxy ServerB via ServerA... say
 > > ProxyPass /EXTERNAL http://ServerB
 > >
 > > If I go to http://ServerA/EXTERNAL/sales/products.html, the embedded
 > > image /images/logo.gif is requested from ServerA.
 > >
 > > So to handle this I wanted to write a filter for ServerA to parse all
 > > pages served via Location /EXTERNAL and "fix" the links.
 > >
 > > I wrote a handler (see below) using HTML::Parser to extract the tags
 > > that would contain links and process them.
 > >
 > > It works great for the most part... however, it seems that instead of
 > > ServerA getting the entire output from ServerB in one piece, it gets
 > > it in chunks, which get processed individually. This causes my handler
 > > to fail when a tag is split between 2 chunks.
 > >
 > > What I think needs to be done is to build up the document in a
 > > variable ($html .= $buffer;) and then call $p->parse($html) once the
 > > entire document has been received by ServerA (or maybe it's as simple
 > > as only calling $p->eof; at that point).
 > >
 > > Or is there a better way to do this? One problem I've found so far is
 > > that I also need to fix style sheets, but I can probably write a
 > > special handler for them once I get this problem fixed.
 > >
 > > Thanks!
 > >
