On Aug 7, 2007, at 2:36 PM, Lonnie Olson wrote:

On Tue, 2007-08-07 at 10:53 -0600, Richard K Miller wrote:
Our web server has a couple dozen virtualhosts. A few have their own
robots.txt files, but for all the others I'd like to implement a
default robots.txt. Is there a way to do this in Apache instead of
copying or linking a file in every directory?

I tried putting this near the top of my httpd.conf but it didn't work:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule /robots.txt /www/public/robots.txt [L]

You are very close. mod_rewrite[1] doesn't map URLs to the file system;
it just rewrites the URL to another URL.  mod_alias[2] is what can
do fun things with mapping URLs to the file system.

Thanks for pointing this out -- I didn't know.

###mod_rewrite
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
# Don't really need to verify robots.txt is a directory
RewriteRule /robots.txt http://masterhost.domain.com/robots.txt [L]

If I use "http://" in the substitution string, it issues an HTTP redirect, which I don't want. (I want each robots.txt to be served from its own domain.)
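A sketch of the difference, for anyone following along (hostname and paths are just the examples from this thread): a substitution that is a full URL with a scheme and host makes mod_rewrite send an external redirect to the client, while a local path keeps the rewrite internal, so the file is served under each vhost's own hostname.

```apache
# External: the absolute URL forces an HTTP redirect to masterhost.
RewriteRule ^/robots\.txt$ http://masterhost.domain.com/robots.txt [L]

# Internal: a local path is rewritten server-side, no redirect,
# so the response still appears to come from the vhost's own domain.
RewriteRule ^/robots\.txt$ /www/public/robots.txt [L]
```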

###Or even easier with mod_alias
Alias /robots.txt /www/public/robots.txt

The Alias directive works, with one exception: the file at /www/public/robots.txt is always served, even if a VirtualHost has its own robots.txt file. Is there a way to do the equivalent of RewriteCond %{REQUEST_FILENAME} !-f with an Alias directive?
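For the archives, a sketch of one way to get the fallback behavior (untested, and assuming the shared copy lives at /www/public/robots.txt as above): mod_alias has no conditional form, but mod_rewrite can test the filesystem before rewriting. Note that in server/VirtualHost context %{REQUEST_FILENAME} has not been mapped to a file yet, so test the vhost's document root explicitly:

```apache
RewriteEngine On
# Only rewrite when this vhost has no robots.txt of its own.
RewriteCond %{DOCUMENT_ROOT}/robots.txt !-f
RewriteRule ^/robots\.txt$ /www/public/robots.txt [L]
```

If a VirtualHost enables its own RewriteEngine, rules from the main server are not applied there by default; `RewriteOptions Inherit` in the vhost (or repeating the rules per vhost) may be needed.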








_______________________________________________

UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net
