On Thu, Nov 09, 2000 at 10:43:19PM -0300, Nicolás Lichtmaier wrote:
> > FYI I've added a robots.txt file in the root directory of lists.debian.org
> > HTMLs, containing:
> >
> >   User-agent: *
> >   Disallow: *
> >
> > Hopefully it will fix googlebot and other web crawlers that seem to have
> > noticed the lists archives and which tend to DoS the machine... :/
>
> That's very bad news, you are shutting down a useful service to our
> users.. =/
It's either that or googlebot shuts master.d.o down. :| I'll have to check
whether it's the lists or the BTS that generates most of the load, though.

-- 
Digital Electronic Being Intended for Assassination and Nullification

