> I'm serving city maps. While the number of output pages is
> effectively infinite for all practical purposes, they accumulate
> around the same places.

The company that does this in the UK appears to want its pages to be 
forced dynamic, generating a new serially numbered page for each 
access.  This seems to be intended to stop people bookmarking the 
page, or linking to it, and thus bypassing the adverts.  In fact, we 
originally had a problem with a CERN cache in that it was making the 
maps cacheable, and we ended up accumulating large numbers of 
identical GIFs; they subsequently fixed this.
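For illustration, here is a minimal CGI sketch of that trick (not the
company's actual code; the www.example.com host, the /maps/page path
and the serial format are all invented): each hit on the entry URL is
redirected to a fresh, uniquely numbered URL, so bookmarks and caches
never name the same page twice.

    #!/usr/bin/perl
    # Sketch only: redirect every hit on the entry URL to a
    # fresh, serially numbered URL.  Host, path and serial
    # format are made up for illustration.
    use strict;

    my $serial = time() . sprintf("%04d", int(rand(10000)));
    print "Status: 302 Moved Temporarily\r\n";
    print "Location: http://www.example.com/maps/page/$serial\r\n";
    print "Pragma: no-cache\r\n";    # for HTTP/1.0 caches
    print "Expires: Thu, 01 Jan 1970 00:00:00 GMT\r\n";
    print "\r\n";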

> handler simulates what you are suggesting.  I eliminate the
> question mark and suddenly all ignorant cache servers cache me.

The reason for this is that there are a lot more ignorant CGI 
script writers who have misused GET mode forms for applications which 
do not behave as pure functions (where the returned page depends only 
on the parameters, not on past history, and there are no side 
effects).  As a result, I believe that the HTTP and/or HTML 
specifications now consider it unsafe to cache any forms URL, 
although the HTTP specification does allow this to be overridden by 
including an explicit expiration time (section 13.9).
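A minimal sketch of that override, assuming the script really is a
pure function of its parameters (HTTP::Date ships with LWP; the
one-day lifetime is an arbitrary choice):

    #!/usr/bin/perl
    # Sketch: a GET-with-query-string response becomes cacheable
    # again if the server supplies an explicit expiration time.
    use strict;
    use HTTP::Date qw(time2str);

    print "Content-Type: image/gif\r\n";
    print "Expires: ", time2str(time + 24 * 60 * 60), "\r\n";
    print "\r\n";
    # ... emit the map GIF for the requested parameters here ...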

A lot of things that are done on the web are to work around 
entrenched abuses.

Note this may fail to post to the Perl/Apache list.


-- 
David Woolley - Office: David Woolley <[EMAIL PROTECTED]>
BTS             Home: <[EMAIL PROTECTED]>
Wallington      TQ 2887 6421
England         51° 21' 44" N,  00° 09' 01" W (WGS 84)
