On 2013-07-11 14:43:21 +0100, Greg Stark wrote:
> On Wed, Jul 10, 2013 at 9:36 AM, Magnus Hagander <mag...@hagander.net> wrote:
> > We already run this, that's what we did to make it survive at all. The
> > problem is there are so many thousands of different URLs you can get
> > to on that site, and google indexes them all by default.
>
> There's also https://support.google.com/webmasters/answer/48620?hl=en
> which lets us control how fast the Google crawler crawls. I think it's
> adaptive though, so if the pages are slow it should be crawling slowly.
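Throttling how fast a crawler hits the site is one knob; the other is keeping it away from the expensive views altogether via robots.txt. A minimal sketch of the latter, assuming gitweb's default "?p=<repo>;a=<action>" URL scheme and a crawler that understands wildcard Disallow patterns (this is not necessarily what postgresql.org actually runs):

    # Sketch only -- not postgresql.org's actual robots.txt.  The action
    # names assume gitweb's default "a=" parameter values.
    User-agent: *
    Disallow: /*a=commitdiff
    Disallow: /*a=blobdiff
    Disallow: /*a=blame
    Disallow: /*a=snapshot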
The problem is that gitweb gives you access to more than a million pages...

Revisions:        git rev-list --all origin/master|wc -l  => 77123
Branches:         git branch --all|grep origin|wc -l
Views per commit: commit, commitdiff, tree

Three views for each of ~77k commits already amounts to well over
200,000 URLs, and every tree view fans out further into per-directory
and per-file pages (rough arithmetic at the end of this mail).

So, slow crawling isn't going to help very much.

Greetings,

Andres Freund

--
 Andres Freund                     http://www.2ndQuadrant.com/
 PostgreSQL Development, 24x7 Support, Training & Services
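A rough back-of-the-envelope version of the numbers above, as a sketch only: the 77123 and the three views per commit come from the list above, while the per-commit tree fan-out is an assumption for illustration, not a measurement.

    #!/bin/sh
    # Rough estimate of crawlable gitweb URLs.  77123 is the rev-list count
    # quoted above; views_per_commit and avg_tree_pages are assumptions.
    revisions=77123
    views_per_commit=3                  # commit, commitdiff, tree
    base=$((revisions * views_per_commit))
    echo "per-commit views:         $base"                                   # 231369

    # Each tree view is browsable per directory, so every commit adds a
    # handful of additional tree URLs.  An assumed average of 10 such
    # pages per commit already pushes the total past a million.
    avg_tree_pages=10
    echo "plus per-directory trees: $((base + revisions * avg_tree_pages))"  # 1002599

Even before counting blob, blame and per-file history views, the URL space is far too large for crawl-rate throttling alone to make a difference.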