Our web server has a couple dozen virtual hosts. A few have their own robots.txt files, but for all the rest I'd like to serve a default robots.txt. Is there a way to do this in the Apache configuration instead of copying or symlinking the file into every document root?

I tried putting this near the top of my httpd.conf, but it didn't work:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule /robots.txt /www/public/robots.txt [L]
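
My guess is that those rules never fire because mod_rewrite directives in the main server context aren't inherited by virtual hosts unless each vhost opts in. If that's right, something like this might work instead (untested sketch; assumes Apache 2.4.8 or later for RewriteOptions InheritDownBefore, and that /www/public/robots.txt from above is readable by the server):

```apache
# Main server config, outside any <VirtualHost>.
RewriteEngine On
# Push these rules down into every vhost without touching their
# configs (needs Apache 2.4.8+).
RewriteOptions InheritDownBefore

# Skip vhosts that ship their own robots.txt in their document root.
RewriteCond %{DOCUMENT_ROOT}/robots.txt !-f
RewriteRule ^/robots\.txt$ /www/public/robots.txt [L]

# Allow Apache to serve the shared file.
<Directory "/www/public">
    Require all granted
</Directory>
```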

Another way to think about it would be: "if the request would return a 404 and the requested file was robots.txt, serve the default robots.txt instead." But I'm not sure how to translate that into Apache-speak.
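
If the 404 angle is easier to express, maybe an ErrorDocument inside a <Location> block would do it, though I'd expect the response to keep its 404 status, which crawlers may treat as "no robots.txt at all" (untested sketch; /default-robots.txt is just a made-up internal path):

```apache
# Main server config; Alias, <Location>, and ErrorDocument are
# inherited by vhosts that don't override them.
Alias /default-robots.txt /www/public/robots.txt

<Location "/robots.txt">
    # Serves the shared file as the 404 body. Vhosts with their own
    # robots.txt never 404 here, so they are unaffected. The status
    # stays 404, so the rewrite approach is probably safer.
    ErrorDocument 404 /default-robots.txt
</Location>
```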

Richard



_______________________________________________

UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net
