On Tue, 2007-08-07 at 10:53 -0600, Richard K Miller wrote:
> Our web server has a couple dozen virtualhosts. A few have their own  
> robots.txt files, but for all the others I'd like to implement a  
> default robots.txt. Is there a way to do this in Apache instead of  
> copying or linking a file in every directory?
> 
> I tried putting this near the top of my httpd.conf but it didn't work:
> 
> RewriteEngine On
> RewriteCond %{REQUEST_FILENAME} !-f
> RewriteCond %{REQUEST_FILENAME} !-d
> RewriteRule /robots.txt /www/public/robots.txt [L]

You are very close.  mod_rewrite[1] doesn't map URLs to the file system;
it just rewrites one URL to another.  mod_alias[2] is what can do
fun things with mapping URLs to the file system.

Your idea with the error document could also do cool things, but it
requires some advanced *-fu.

Here are some examples that should work.

###mod_rewrite
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
# No need for the !-d check here -- robots.txt won't be a directory.
# Anchor the pattern and escape the dot so only /robots.txt matches;
# a full URL as the substitution makes this an external redirect.
RewriteRule ^/robots\.txt$ http://masterhost.domain.com/robots.txt [L]

###Or even easier with mod_alias
Alias /robots.txt /www/public/robots.txt
# Note: Alias is only valid in the server config or inside a
#  <VirtualHost> block (not .htaccess), so put it in httpd.conf for
#  a global default, or inside a <VirtualHost> to make it site-specific.
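A sketch of how that might fit together in httpd.conf (the hostname and
paths below are invented for illustration).  If memory serves, per-vhost
aliases are consulted before the main-server ones, so a site that keeps
its own robots.txt can simply declare its own Alias:

# Default robots.txt for every virtualhost that doesn't override it
Alias /robots.txt /www/public/robots.txt

<VirtualHost *:80>
    ServerName special.example.com
    DocumentRoot /www/special
    # This vhost keeps its own robots.txt; its Alias wins over the default
    Alias /robots.txt /www/special/robots.txt
</VirtualHost>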

[1] http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
[2] http://httpd.apache.org/docs/2.0/mod/mod_alias.html


_______________________________________________

UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net