I wanted to do this before the freeze but never had a chance -- let's get robots.txt working on hosted1 and try to bring down CPU load and improve page load times a bit.
Thoughts/+1's please. robots.txt is already there and has been for a long time, but nothing has told apache to use it, because apache requests go straight to trac. robots.txt is set to block crawlers from accessing fh.o/*/browser/* (which is the trac source code browser) -- as per https://fedorahosted.org/fedora-infrastructure/ticket/1848

diff --git a/configs/web/fedorahosted.org.conf b/configs/web/fedorahosted.org.conf
index d0f7139..8f1f2e1 100644
--- a/configs/web/fedorahosted.org.conf
+++ b/configs/web/fedorahosted.org.conf
@@ -2,6 +2,9 @@
     ServerName fedorahosted.org
     ServerAlias www.fedorahosted.org
 
+    # Make robots.txt be used.
+    Alias /robots.txt /srv/web/fedorahosted.org/robots.txt
+
     Redirect 301 / https://fedorahosted.org/
 </VirtualHost>
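For reference, here is a minimal sketch of what the robots.txt at /srv/web/fedorahosted.org/robots.txt might contain, assuming it only blocks the trac source browser as described in the ticket (the actual file on hosted1 may carry more rules):

    # Keep crawlers out of the trac source code browser on every project.
    # Note: the mid-path wildcard is a de-facto extension honored by the
    # major crawlers (e.g. Googlebot), not part of the original robots.txt
    # standard, so less common bots may ignore it.
    User-agent: *
    Disallow: /*/browser/

Once the Alias is in place, it can be sanity-checked with something like
"curl -sI https://fedorahosted.org/robots.txt" to confirm apache serves
the file directly instead of handing the request off to trac.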
