> > > FYI I've added a robots.txt file in the root directory of lists.debian.org
> > > HTMLs, containing:
> > >
> > > User-agent: *
> > > Disallow: *
> > >
> > > Hopefully it will fix googlebot and other web crawlers that seem to have
> > > noticed the lists archives and which tend to DoS the machine... :/
> >
> > That's very bad news, you are shutting down a useful service to our
> > users.. =/
>
> It's either that or googlebot shuts master.d.o down. :|
>
> I'll have to check if it's the lists or the BTS that makes most load,
> though.
Perhaps the Google people should be contacted...
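
As a side note on the snippet quoted above: `Disallow: *` is not part of the original robots exclusion standard, which matches URL path prefixes rather than wildcards. Some crawlers (Googlebot among them) accept `*` as an extension, but the portable way to block all crawlers from everything is a `/` prefix:

    User-agent: *
    Disallow: /

Crawlers that implement only the base standard may ignore `Disallow: *` entirely, which would defeat the purpose here.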

