At 04:54 15/3/2001, André Langhorst wrote:
>>Read:  You can now efficiently have your outgoing PHP/HTTP traffic 
>>compressed with good compression ratios without sacrificing memory.
>>Comments extremely welcome - and if people can also test it and let me 
>>know whether it works for them, it'd be great.
>
>Taking your example...
> >>ob_start("ob_gzhandler", 64)            10KB
>and if I got this right, the output handler should be called 10 times for a
>640kB file, right? If that were the case, the following script should take
>roughly
>(100000*50 (total output) / 128 (buffer size)) * 3 (seconds per sleep) seconds to finish.
>
>This is not the case!!! (at least on win32 again)
>
>$x = str_repeat('t', 99996) . '<BR>';
>function strlens($string) {
>    //return str_replace("\n", '<br>', $string);
>    sleep(3);
>    return $string;
>}
>
>ob_start('strlens',128);
>echo str_repeat($x,50);
>ob_end_flush();
>
>
>I tried to find some code to test whether I actually get the output chunked
>or not, but I don't know of any other way; is there one?

No, chunked output buffering doesn't guarantee that the chunks will be 
exactly the size you ask for.  The output handler will be invoked once the 
buffer holds at least the chunk size you specified, but if it holds more, it 
will be fed all of the buffered bytes.  In your case, each output statement 
generates a chunk far larger than the buffer size, so the buffer fills with 
one huge chunk per statement, and the handler is fed all of it at once.
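For what it's worth, here is a minimal sketch (not from the original mail; the handler and variable names are made up) that logs how many bytes the output handler receives on each invocation, illustrating that a single large echo is handed to the handler in one piece rather than in buffer-sized slices:

```php
<?php
// Hypothetical demo: record how many bytes the output handler is fed
// on each invocation.  With a 128-byte chunk size and a single
// 1000-byte echo, the handler typically receives the whole 1000 bytes
// at once, not eight 128-byte slices.
$sizes = [];

function sizeLogger($chunk)
{
    global $sizes;
    $sizes[] = strlen($chunk); // size of this invocation's input
    return '';                 // swallow the output to keep the demo quiet
}

ob_start('sizeLogger', 128);   // request ~128-byte chunking
echo str_repeat('t', 1000);    // one output call, much larger than 128
ob_end_flush();

// Inspect the recorded sizes afterwards, e.g. with print_r($sizes);
```

Running something like this is an easy way to answer the "how do I test whether I get the stuff chunked" question above: the recorded sizes show exactly what the handler was fed.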

Zeev


--
PHP Development Mailing List <http://www.php.net/>
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]
