On Tue, May 10, 2011 at 02:02:11PM -0400, Ricky Elrod wrote:
> Wanted to do this before freeze, but never had a chance -- let's get
> robots.txt working on hosted1 and try to bring down cpu load and improve
> page load times a bit.
>
> Thoughts/+1's please. robots.txt is already there and has been for a long
> time, but nothing has told apache to use it, because apache requests go
> straight to trac.
>
> robots.txt is set to block crawlers from accessing fh.o/*/browser/* (which
> is the trac source code browser) -- as per
> https://fedorahosted.org/fedora-infrastructure/ticket/1848

+1
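For reference, a rule like the one described above would look roughly like this in robots.txt (a sketch only; the actual file on hosted1 may differ, and note that the `*` wildcard in a Disallow path is honored by the major crawlers but is not part of the original robots exclusion standard):

```
# Sketch of the rule from ticket #1848: keep crawlers out of the
# trac source browser under any project. Actual file may differ.
User-agent: *
Disallow: /*/browser/
```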
The web interfaces at git.fedorahosted.org/git/, bzr.fp.o/bzr, and
hg.fp.o/hg take care of this, I think. The one question I have in regards
to this is: do we have an svn web viewer?

-Toshio

> diff --git a/configs/web/fedorahosted.org.conf b/configs/web/fedorahosted.org.conf
> index d0f7139..8f1f2e1 100644
> --- a/configs/web/fedorahosted.org.conf
> +++ b/configs/web/fedorahosted.org.conf
> @@ -2,6 +2,9 @@
>      ServerName fedorahosted.org
>      ServerAlias www.fedorahosted.org
>
> +    # Make robots.txt be used.
> +    Alias /robots.txt /srv/web/fedorahosted.org/robots.txt
> +
>      Redirect 301 / https://fedorahosted.org/
>  </VirtualHost>
_______________________________________________
infrastructure mailing list
[email protected]
https://admin.fedoraproject.org/mailman/listinfo/infrastructure
