On 2013-07-09 16:24:42 +0100, Greg Stark wrote:
> I note that git.postgresql.org's robots.txt refuses permission to crawl
> the git repository:
> http://git.postgresql.org/robots.txt
>
>     User-agent: *
>     Disallow: /
>
> I'm curious what motivates this. It's certainly useful to be able to
> search for commits.

Gitweb is horribly slow. I don't think anybody serving a sizable git
repository through gitweb can afford to let all the crawlers loose on it.
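For what it's worth, the quoted policy really does lock out every compliant
crawler. A quick illustration (not part of the original mail) using Python's
stdlib robots.txt parser:

```python
# Shows that "User-agent: *" / "Disallow: /" refuses every URL
# to every crawler, per the Robots Exclusion Protocol.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

rfp = RobotFileParser()
rfp.parse(robots_txt.splitlines())

# No crawler may fetch anything, not even the gitweb summary page.
print(rfp.can_fetch("Googlebot", "http://git.postgresql.org/gitweb/"))  # False
print(rfp.can_fetch("*", "http://git.postgresql.org/"))                 # False
```

So a compliant search-engine bot never even reaches gitweb, which is exactly
the point: the expensive pages are never generated for crawlers at all.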



 Andres Freund                     http://www.2ndQuadrant.com/
 PostgreSQL Development, 24x7 Support, Training & Services

Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)