[EMAIL PROTECTED] (darren chamberlain) wrote:
>Luis,
>
>Write a handler (or cgi script, or registry script, or NSAPI plugin, or
>PHP page) that handles 404 Errors, generates the (static) page, and
>writes it to the location in the file system where the requested page
>should live. The next time it is called, it will be treated like any
>other HTML file request. The fastest way to cache pages is to have them
>be regular HTML.
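
(A rough sketch of darren's 404 trick as a mod_perl 1 handler.  The
regenerate_html() routine and the config below are made up, and this is
untested:)

    # httpd.conf:
    #   ErrorDocument 404 /regen
    #   <Location /regen>
    #     SetHandler perl-script
    #     PerlHandler My::Regen404
    #   </Location>

    package My::Regen404;
    use strict;
    use Apache::Constants qw(OK NOT_FOUND);

    sub handler {
        my $r = shift;
        my $orig = $r->prev or return NOT_FOUND;  # the request that 404'd
        my $file = $orig->filename;               # where the page should live
        my $html = regenerate_html($orig->uri);   # your page-building code here
        return NOT_FOUND unless defined $html;

        # Write it to disk so the next hit is served as plain static HTML
        open FH, "> $file" or return NOT_FOUND;
        print FH $html;
        close FH;

        # ...and serve it this time around
        $r->content_type('text/html');
        $r->send_http_header;
        $r->print($html);
        return OK;
    }
    1;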

Or have them be resident in memory, which squid can do.  Why reinvent this?

>Another option is to set up whatever handler you want, on a development
>or staging server (i.e., not the live one), and grab the pages with
>lynx -dump or GET or an LWP script, and write them to the proper places
>in the filesystem where the live server can access them. With a little
>planning, this can be incorporated into a cron job that runs nightly
>(or hourly, whatever) for stuff that is updated regularly but is
>composed of discernible chunks.
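
(A minimal cron-able version of that, using LWP::Simple -- the staging
host, docroot and page list here are invented:)

    #!/usr/bin/perl -w
    # Nightly cron job: snapshot pages from the staging server into the
    # live server's docroot.
    use strict;
    use LWP::Simple qw(mirror is_success);

    my $staging = 'http://staging.example.com';
    my $docroot = '/usr/local/apache/htdocs';
    my @pages   = qw(/index.html /news/today.html /products/list.html);

    for my $uri (@pages) {
        my $rc = mirror("$staging$uri", "$docroot$uri");
        warn "couldn't fetch $uri: $rc\n"
            unless is_success($rc) or $rc == 304;   # 304 = already current
    }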

I've used this before and it works well.  One disadvantage is that Luis
would have to move all his existing scripts to different places, and fix
any file paths that break as a result.  Since Luis says he wants a cache
on the front end, a front-end cache like squid seems like the better fit.

Putting squid in front of an Apache server used to be very popular - has
it fallen out of favor?  Most of the answers given in this thread seem
to be more of the roll-your-own-cache variety.
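
For what it's worth, the classic squid-2 accelerator setup is only a few
lines of squid.conf, with Apache moved off to a back-end port (the
addresses and ports here are just an example):

    # squid.conf -- squid 2.x as an HTTP accelerator in front of Apache,
    # which listens on port 8080 on the same box
    http_port 80
    httpd_accel_host 127.0.0.1
    httpd_accel_port 8080
    httpd_accel_uses_host_header on
    httpd_accel_with_proxy off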


  -------------------                            -------------------
  Ken Williams                             Last Bastion of Euclidity
  [EMAIL PROTECTED]                            The Math Forum

