What kind of caches are you talking about?
Are these full page caches?  The kind that get stored into /public?

My question is: instead of adding /robot to the URL when Apache
detects the robot, could you change Apache's DocumentRoot instead?
It seems to me that this would prevent Apache from finding the cached
page.  Also, if you point it at another copy of your /public, you
could still get the normal static pages...
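
Roughly what I have in mind — a minimal sketch only, assuming mod_rewrite
is available and that robots are recognized by User-Agent (the bot names,
paths, and the public_robots directory here are all hypothetical):

```apache
# Hypothetical: keep a second copy of the app's public/ for robots,
# without the cached pages, and serve matching files from it instead.
RewriteEngine On
# Assumed robot detection by User-Agent substring (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|crawler) [NC]
# Only rewrite if a corresponding file actually exists in the robot copy
RewriteCond /var/www/app/public_robots%{REQUEST_URI} -f
RewriteRule ^(.*)$ /var/www/app/public_robots$1 [L]
```

That avoids touching the URL at all; requests from robots simply never
see the cached files under the normal /public.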

I think, since you are talking about changing Mongrel's caching
behaviour, that perhaps you aren't talking about the page caches that
get stored into /public. (Well, I'm rusty on the terminology here.)

-- 
Michael Richardson <m...@simtone.net>
Director -- Consumer Desktop Development, Simtone Corporation, Ottawa, Canada
Personal: http://www.sandelman.ca/mcr/ 

SIMtone Corporation fundamentally transforms computing into simple,
secure, and very low-cost network-provisioned services pervasively
accessible by everyone.  Learn more at www.simtone.net and www.SIMtoneVDU.com 


_______________________________________________
Mongrel-users mailing list
Mongrel-users@rubyforge.org
http://rubyforge.org/mailman/listinfo/mongrel-users
