On Wed, Jun 2, 2010 at 3:29 AM, Jeffrey E Burgoyne <burgo...@keenuh.com> wrote:
> A 4:1 compression ratio is fine unless you are serving lots of rich
> content, which generally will see no performance gain, if not actually
> reduced performance.

The rich content doesn't need to go through the deflate filter,
though, so you needn't (and shouldn't) incur any CPU overhead for it.

The approach I've used successfully on large sites in the past is to
do on-the-fly compression when:

1. The response content-type is HTML, CSS, or JavaScript (plus plain
text and XML, for sites serving up those in nontrivial amounts)

2. and the request carries an Accept-Encoding header indicating that
the client can handle gzip

3. and the User-Agent is not one of the ones with known problems
handling compressed content (IE <=5 and Netscape <=4, if I remember
correctly)
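
For concreteness, the mod_deflate version of that would look something
like the sketch below.  It's untested, condition 2 comes for free
(mod_deflate only compresses when the client advertises gzip in
Accept-Encoding), and the BrowserMatch lines are just the stock recipe
from the mod_deflate docs rather than a precise match for the IE/Netscape
versions above:

    # Condition 1: compress only the text-ish content types, so rich
    # content never enters the DEFLATE filter at all.
    AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
    AddOutputFilterByType DEFLATE text/javascript application/javascript

    # Condition 3: stock workaround for the broken Netscape 4.x and
    # early MSIE user agents; adjust if IE 5 also needs excluding.
    BrowserMatch ^Mozilla/4         gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE             !no-gzip !gzip-only-text/html

    # Tell downstream caches the response varies by User-Agent too
    # (needs mod_headers); mod_deflate adds Vary: Accept-Encoding itself.
    Header append Vary User-Agent

Using AddOutputFilterByType rather than a blanket SetOutputFilter DEFLATE
is what keeps images and other rich content out of the filter in the
first place.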

The only thing stopping me from suggesting this as a default is the
concern that others have noted regarding downstream caches.  (I think
compression probably ought to be a Transfer-Encoding rather than a
Content-Encoding, but that's another debate entirely.)

-Brian
