Just for the record: this topic contains two suggested solutions:
1) storing gzip-compressed and uncompressed HTML separately and having Nginx,
instead of the client, determine gzip support
2) storing only gzip-compressed HTML and using Nginx's gunzip module to
decompress it for browsers without gzip support
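A minimal sketch of solution 2, assuming the standard ngx_http_gunzip_module is compiled in; the upstream and cache zone names are placeholders:

```nginx
location / {
    proxy_pass http://backend;      # hypothetical upstream
    proxy_cache my_cache;           # hypothetical cache zone

    # Always request gzip from the upstream so only the compressed
    # variant is stored in the cache.
    proxy_set_header Accept-Encoding gzip;

    # Decompress on the fly for the minority of clients that did not
    # send "Accept-Encoding: gzip".
    gunzip on;
}
```

This keeps a single cached variant per URL, so the cache-key problem discussed below never arises.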
Hi Lucas,
Thanks a lot for the information! Hopefully it will help many others who
find this topic via Google, as there was almost no information about it
available.
Posted at Nginx Forum:
https://forum.nginx.org/read.php?2,270604,270665#msg-270665
Hi Lucas,
Thanks a lot for the suggestion. We were already using that solution, but
strange behavior occurred (see the opening post). The first request uses the
expected MD5 hash of the KEY, and that client keeps using it (the
MISS/HIT header is accurate). However, requests from other
Hi!
It sounds like a good solution to improve performance; however, I just
read the following post by Jake Archibald (a Google Chrome developer).
"Yeah, ~10% of BBC visitors don’t support gzip compression. It was higher
during the day (15-20%) but lower in the evenings and weekends (<10%).
Hi *B. R.*!
Thanks a lot for the reply and information! The KEY, however, does not
contain different data from $http_accept_encoding. When viewing the contents
of the cache file, it contains the exact same KEY for both MD5 hashes. Also,
it does not matter which browser is used for the first request.
Hi!
I was wondering if anyone has an idea for serving pre-compressed (gzip) HTML
using proxy_cache / fastcgi_cache.
I tried a solution with a map of $http_accept_encoding as part of the
fastcgi_cache_key, with gzip-compressed output from the script, but it
resulted in strange behavior (the MD5 hash
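For readers arriving via search, the map-based cache-key approach described above looks roughly like this; the cache path, zone name, and FastCGI socket are hypothetical:

```nginx
# Normalize Accept-Encoding so the cache key has at most two variants
# (empty vs. "gzip") instead of one per distinct header value.
map $http_accept_encoding $gzip_key {
    default   "";
    ~*gzip    "gzip";
}

fastcgi_cache_path /var/cache/nginx keys_zone=app:10m;  # hypothetical

server {
    location ~ \.php$ {
        fastcgi_pass unix:/run/php-fpm.sock;   # hypothetical socket
        include fastcgi_params;
        fastcgi_cache app;
        # Include the normalized encoding in the key so compressed and
        # uncompressed responses are cached separately.
        fastcgi_cache_key "$scheme$request_method$host$request_uri$gzip_key";
    }
}
```

This is the setup that produced the strange MD5-hash behavior reported in this thread, so treat it as a starting point for reproduction rather than a recommended configuration.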