On Tue, 22 Nov 2011 11:57:15 -0500, Doug Karl wrote:
We are trying to configure Squid for installation in school labs in
Belize, Central America where the Internet routinely goes down for
several minutes and sometimes an hour at a time. We are very happy
to serve up stale pages to the children for their classroom session. So
we need to either: (1) configure Squid to handle such situations so that
cached pages are simply served stale when the Internet is down (i.e. we
cannot reach the Internet to verify freshness), or (2) have Squid respond to
a script that detects that the Internet is down, telling it to serve up
stale pages while it is down. As configured, our Squid
implementation will not serve stale pages because it tries to access
the original Web site and the cached pages are not served up at all.
NOTE: We have tried "Squid Off-line mode" and that does not work as
you would expect, as several others have reported. So are there config
parameters that can make caching work in the presence of a bad Internet
connection?
Yes and no.
The key directive _is_ offline_mode. The confusing bit is that for
situations like yours the mode must always be set to ON; don't
toggle it on/off. All it does is expand the type of things Squid caches
to include some which would normally be discarded immediately. It
prepares the cache content as best as possible for the second
directive:

max_stale - once items are already in cache (via offline_mode), this
controls how long they may be served after the Internet connection
goes down.
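Putting those two together, a minimal squid.conf sketch might look like
this (the one-week max_stale value is just an illustration; pick whatever
suits your outage lengths):

    # keep objects that would normally be discarded, ready for offline use
    offline_mode on

    # allow cached objects to be served up to 1 week past their expiry
    max_stale 1 week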
There are also refresh_pattern max-stale=N options in Squid 2.7 and
3.2 that provide per-URL staleness control.
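For instance, to limit the extra staleness allowance to image objects
you could use a pattern along these lines (the regex and the times, in
minutes, are assumptions for illustration, not recommendations):

    # serve matching objects up to 1 week (10080 minutes) past expiry
    refresh_pattern -i \.(gif|jpe?g|png)$ 1440 50% 10080 max-stale=10080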
HTTP responses from websites can also contain stale-serving controls
telling your Squid it's safe to cache the object and serve it while stale.
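As a sketch of what such a response might carry (stale-if-error is the
RFC 5861 Cache-Control extension for exactly this serve-stale-on-failure
case; the lifetimes shown are made-up examples):

    HTTP/1.1 200 OK
    Cache-Control: public, max-age=3600, stale-if-error=86400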
Note that all of this is determined by the cacheability of the site
objects from the start. If an object is not safe to cache and re-use,
the page which depends on it will break in some way while offline. A lot
of webmasters do not send cache-friendly headers and so create sites
which break very easily.
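For contrast, a cache-friendly response carries an explicit lifetime and
validators, something like the following (values are illustrative):

    Cache-Control: public, max-age=86400
    Last-Modified: Mon, 21 Nov 2011 10:00:00 GMT
    ETag: "abc123"

whereas headers like Cache-Control: no-store or no-cache force Squid
back to the origin server every time, and so break while offline.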