I've developed an Apache::Dynagzip handler for Outlook Technologies, Inc. to serve dynamic content with the option to control the size of the chunk(s). It works fine both standalone and within an Apache::Filter chain. Using Apache::Dynagzip you have several options to control your chunk size(s). You can even control the size of every chunk from the source generator: just send the end-of-chunk mask to Apache::Dynagzip within the outgoing content stream. (You might wish to send your header to the client browser while still generating the rest of the body...) Otherwise, it buffers the outgoing stream until the chunk reaches the declared minimum size (2K by default).
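For reference, a filter chain of that kind is usually wired up in httpd.conf roughly like this (a minimal sketch for Apache 1.3 / mod_perl 1.x; the minChunkSize variable name is my assumption here, so check the Apache::Dynagzip documentation for the directives your version actually supports):

    PerlModule Apache::Filter
    PerlModule Apache::Dynagzip

    <Files ~ "\.pl$">
        SetHandler  perl-script
        # The content generator runs first; Apache::Dynagzip compresses last:
        PerlHandler Apache::RegistryFilter Apache::Dynagzip
        PerlSetVar  Filter On
        # Assumed directive name: minimum buffered size before a chunk is sent
        PerlSetVar  minChunkSize 2048
    </Files>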
Is your handler Apache::Filter Chain compatible?

Thanks,
Slava Bizyayev

----- Original Message -----
From: "Igor Sysoev" <[EMAIL PROTECTED]>
To: "Nicholas Oxhoj" <[EMAIL PROTECTED]>
Cc: <[EMAIL PROTECTED]>
Sent: Tuesday, February 19, 2002 4:12 AM
Subject: Re: "Streaming" compression of output from mod_perl handler?

> On Tue, 19 Feb 2002, Nicholas Oxhøj wrote:
>
> > I am looking for an Apache module which will allow me to compress the
> > output of my mod_perl handler (a "native" handler, i.e. not running
> > under Apache::Registry). But since my handler can potentially take a
> > long time to finish, the output has to be compressed in a "streaming"
> > fashion (or in small blocks) so that the browser starts receiving data
> > before my handler has completely finished.
> >
> > I have been experimenting with all the different Apache compression
> > modules I have been able to find, but have not been able to get the
> > desired result. I have tried Apache::GzipChain, Apache::Compress,
> > mod_gzip and mod_deflate, with different results. One I cannot get to
> > work at all. Most work, but seem to collect all the output before
> > compressing it and sending it to the browser.
> >
> > There also seems to be an issue with the new HTTP/1.1 "chunked"
> > transfer-encoding. For instance, mod_gzip will not compress chunked
> > output unless you allow it to "dechunk" it by collecting all the output
> > and compressing it as one big block.
> >
> > So I am basically looking for anyone who has had any success in
> > achieving this kind of "streaming" compression, who could direct me to
> > an appropriate Apache module.
>
> Which mod_deflate did you try, mine or the Apache 2.0 one?
> I can comment on my mod_deflate.
>
> First, mod_deflate does not collect all the output before compressing;
> it compresses it on the fly. But it emits the compressed content in 8K
> blocks. That is Apache's HUGE_STRING_LEN #define, and it can be changed
> in the sources. Besides, if some module flushes its output, mod_deflate
> flushes it too.
>
> mod_deflate has no problems with chunked transfer-encoding because it
> compresses the content before Apache starts to make chunks. mod_deflate
> removes the Content-Length header, so the compressed content is sent to
> the client chunked (HTTP/1.1) or not (HTTP/1.0).
>
> Igor Sysoev
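For anyone who wants to do the compression inside a native handler rather than through a separate filter module, here is a rough, untested sketch of streaming gzip with Compress::Zlib under mod_perl 1.x. It is only an illustration, not code from any of the modules discussed above; generate_blocks() is a hypothetical stand-in for whatever produces the handler's output, and in real use you would still want to check the client's Accept-Encoding header first.

    package My::StreamGzip;
    use strict;
    use Apache::Constants qw(OK SERVER_ERROR);
    use Compress::Zlib;

    sub handler {
        my $r = shift;

        # Raw deflate stream; we add the gzip header and trailer ourselves.
        my ($d, $status) = deflateInit(-WindowBits => -MAX_WBITS());
        return SERVER_ERROR unless $status == Z_OK;

        $r->content_type('text/html');
        $r->header_out('Content-Encoding' => 'gzip');
        # No Content-Length, so Apache sends the body chunked to HTTP/1.1 clients.
        $r->send_http_header;

        # gzip header: magic, deflate method, no flags, no mtime, no XFL, OS=Unix
        $r->print(pack("C10", 0x1f, 0x8b, 8, 0, 0, 0, 0, 0, 0, 3));

        my ($crc, $len) = (0, 0);
        for my $block (generate_blocks()) {    # hypothetical content source
            $crc  = crc32($block, $crc);
            $len += length $block;
            my ($out) = $d->deflate($block);
            $r->print($out) if length $out;
            # Force out whatever the deflater is still holding:
            ($out) = $d->flush(Z_SYNC_FLUSH);
            $r->print($out) if length $out;
            $r->rflush;                        # and flush Apache's own buffers
        }

        my ($out) = $d->flush(Z_FINISH);
        $r->print($out) if length $out;
        # gzip trailer: CRC32 and uncompressed length (pack "V" wraps at 32 bits)
        $r->print(pack("VV", $crc, $len));
        return OK;
    }
    1;

The Z_SYNC_FLUSH after each block is what makes the output "streaming": it costs a few bytes of compression ratio per flush, but the browser starts receiving (and can decompress) data long before the handler finishes.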