Hi Gabriel,

> I played around with the rlreturnz stuff and it's nice and it works,
> but my main problem with gzip encoded content is that the browser waits
> until it's downloaded the entire file before it decompresses the data
> and begins rendering the page. A side-effect is that it makes pages
> appear to take longer to load, even though they take less time to
> download. Of course it depends on the content that was compressed in
> the first place, but for HTML pages it's a noticeable difference, which
> is why I never adopted the use of the gzip content encoding.

I think you win more than you lose. I have some statistics
from a gzip module currently in development, with some good numbers:

[08/Jan/2003:20:12:00 +0100]    20598 Bytes     3819 Bytes      0.010000sec
[08/Jan/2003:20:12:38 +0100]    20598 Bytes     3819 Bytes      0.010000sec
[08/Jan/2003:20:13:13 +0100]    11336 Bytes     2479 Bytes      0.000000sec
[08/Jan/2003:20:13:53 +0100]    18025 Bytes     3131 Bytes      0.000000sec
[08/Jan/2003:20:13:54 +0100]    18017 Bytes     3133 Bytes      0.000000sec
[08/Jan/2003:20:13:57 +0100]    18642 Bytes     3503 Bytes      0.000000sec
[08/Jan/2003:20:14:04 +0100]    13407 Bytes     2748 Bytes      0.010000sec

The difference between compression levels 5 and 9, for example, is only
about 150 bytes; rlreturnz.c uses level 3, if I remember right. Using the
highest level and then caching the result, much like the fastpath stuff
does, would also eliminate any worries about processor load.
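For illustration, here is a minimal sketch (not the module's code, just
plain zlib) that compresses one buffer at levels 3, 5 and 9 and prints the
resulting sizes. It uses compress2(), i.e. the zlib wrapper; a real gzip
module would use the gzip wrapper (deflateInit2 with windowBits 15+16)
instead, but the relative size differences between levels come out the same:

    /* Sketch: compare zlib output sizes at different compression levels. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <zlib.h>

    int main(void)
    {
        /* Roughly 20 KB of repetitive "HTML-like" input as a stand-in. */
        static char src[20480];
        for (size_t i = 0; i < sizeof src; i++)
            src[i] = "<td>example cell</td>\n"[i % 22];

        int levels[] = { 3, 5, 9 };
        for (int i = 0; i < 3; i++) {
            uLongf dlen = compressBound(sizeof src);
            Bytef *dst = malloc(dlen);
            if (!dst)
                return 1;
            if (compress2(dst, &dlen, (const Bytef *)src, sizeof src,
                          levels[i]) != Z_OK) {
                free(dst);
                return 1;
            }
            printf("level %d: %lu -> %lu bytes\n",
                   levels[i], (unsigned long)sizeof src,
                   (unsigned long)dlen);
            free(dst);
        }
        return 0;
    }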
Compression IMHO is a big win most of the time, for the vast majority of
web sites. I really don't "feel" that the page takes longer to arrive.
I have some pages with very long result sets, 300k and more, that
compress very well and load and render in a fraction of the time the
uncompressed version takes.

Regards,
Bernd.
