I am talking about Rails' standard page-caching mechanism. By default,
Rails writes full pages into public/..., and if mongrel sees them
there, it serves them itself (without running the Rails dispatcher at
all). This is fine for normal users, but not good for the bot user
agents.
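For reference, the cached files come from caches_page in a controller.
A minimal sketch (ProductsController is just a made-up example):

  class ProductsController < ApplicationController
    # caches_page writes the rendered response to, e.g.,
    # public/products/1.html on the first hit; after that the web
    # server can serve the file without touching Rails at all.
    caches_page :show

    def show
      @product = Product.find(params[:id])
    end
  end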
Here is a new solution:
1) Set Rails to cache into public/cache (config sketch below, after
this list).
2) Use an Apache rewrite rule to serve those files directly (if
found).
3) If not found, pass the request to mongrel, which will not find the
cached files either, since MONGREL ONLY LOOKS IN public for cached
files. Mongrel does not honor the
config.action_controller.page_cache_directory Rails setting.
4) Rails processes the request and writes the result into
public/cache/...
...and on the next request, Apache serves it from the cache.
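For step 1, the setting looks something like this (a minimal sketch,
assuming Rails 2.x, in config/environment.rb or a per-environment
file):

  # Write cached pages under public/cache instead of public, so
  # mongrel never sees them.
  config.action_controller.page_cache_directory =
    File.join(RAILS_ROOT, 'public', 'cache')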
I am working on the rewrite rules etc. for this.
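A first, untested sketch of those rules (assuming the vhost's
DocumentRoot is the app's public directory; the bot list and the
mongrel_cluster balancer name are placeholders):

  RewriteEngine On

  # Don't serve cached pages to bots; their requests fall through to
  # the mongrels below.
  RewriteCond %{HTTP_USER_AGENT} !(Googlebot|Slurp|msnbot) [NC]
  # The root URL maps to the cached index page, if present.
  RewriteCond %{DOCUMENT_ROOT}/cache/index.html -f
  RewriteRule ^/$ /cache/index.html [QSA,L]

  RewriteCond %{HTTP_USER_AGENT} !(Googlebot|Slurp|msnbot) [NC]
  # Serve a cached copy out of public/cache if one exists.
  RewriteCond %{DOCUMENT_ROOT}/cache%{REQUEST_URI}.html -f
  RewriteRule ^.*$ /cache%{REQUEST_URI}.html [QSA,L]

  # Anything that isn't an existing file is proxied to the mongrels.
  RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
  RewriteRule ^/(.*)$ balancer://mongrel_cluster/$1 [P,QSA,L]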
Mike
On Jun 10, 2009, at 6:16 PM, m...@simtone.net wrote:
What kind of caches are you talking about?
Are these full page caches? The kind that get stored into /public?
My question is: instead of adding /robot to the URL when Apache
detects the robot, can you change Apache's DocumentRoot? It seems to
me that this would prevent Apache from finding the cached page. Also,
if you actually point to another copy of your /public, you could
still get the normal static pages...
I think, perhaps, since you are talking about changing mongrel's
caching behaviour, you aren't talking about the page caches that get
stored into /public. (Well, I'm rusty on terminology here.)
--
Michael Richardson <m...@simtone.net>
Director -- Consumer Desktop Development, Simtone Corporation,
Ottawa, Canada
Personal: http://www.sandelman.ca/mcr/