> a.sm...@ukgrid.net wrote:
> > Hi,
> > 
> >   I'd like to have a robots.txt on a site that has the following apache 
> > httpd config:
> > 
> >   <Location />
> >      SetHandler perl-script
> >      PerlHandler RT::Mason
> >   </Location>
> > 
> > But if I install a robots.txt in the DocumentRoot and test it with wget, I
> > just download the front page of the site, since it's handled by perl-script.
> > Is it possible to have a robots.txt in this situation?
> > 
> > thanks for any tips, Andy.
> > 
> Ideas :
> 1) Try a <FilesMatch ^robots\.txt$> section inside the above section, to 
> reset the handler to the Apache default (that may not be so easy)

        It may be easier to match only the suffixes that actually need 
the handler (e.g., .pl, .perl, .html); otherwise you may have to keep 
excluding other types in the future, such as .css, .js, .png, .jpg, 
.jpeg, .gif, .ico, etc.
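
        Since <FilesMatch> sections cannot be nested inside <Location>, 
an equivalent config-only approach is a second <Location> block for just 
that one path; later <Location> sections override earlier ones. An 
untested sketch:

  <Location />
     SetHandler perl-script
     PerlHandler RT::Mason
  </Location>

  # Hand /robots.txt back to Apache's core file handler
  <Location /robots.txt>
     SetHandler default-handler
  </Location>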

> 2) Create a Mason handler to handle the URL "robots.txt" and return the file 
> "as is"

        Be careful here -- make sure it's returned with the correct 
content type of "text/plain". Also note that Apache's "as is" handler 
(mod_asis) sends the file raw and expects the file itself to contain 
its own HTTP headers.
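
        For the Mason route, a minimal sketch (assuming a stock RT/Mason 
setup where the Apache request object $r is visible inside components, 
and that a plain component dropped at something like 
local/html/robots.txt in the component root gets picked up -- both 
assumptions, not verified):

  % # Force the MIME type before the body is sent
  % $r->content_type('text/plain');
  User-agent: *
  Disallow: /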

Randolf Richardson - rand...@inter-corporate.com
Inter-Corporate Computer & Network Services, Inc.
Vancouver, British Columbia, Canada
http://www.inter-corporate.com

