On Sat, Nov 11, 2000 at 09:54:21PM -0300, Nicolás Lichtmaier wrote:
> > > > FYI I've added a robots.txt file in the root directory of
> > > > lists.debian.org HTMLs, containing:
> > > >
> > > > User-agent: *
> > > > Disallow: *
> > > >
> > > > Hopefully it will fix googlebot and other web crawlers that seem to
> > > > have noticed the lists archives and which tend to DoS the machine... :/
> > >
> > > That's very bad news, you are shutting down a useful service to our
> > > users.. =/
> >
> > It's either that or googlebot shuts master.d.o down. :|
> >
> > I'll have to check if it's the lists or the BTS that makes most load,
> > though.
>
> Perhaps the Google people should be contacted....
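[A side note on the quoted robots.txt: under the original robots exclusion
standard, the Disallow field takes a URL path prefix rather than a wildcard
pattern, so `Disallow: *` is not defined by the spec and some crawlers may
ignore it. Blocking the entire site for all crawlers is conventionally
written as:]

```
User-agent: *
Disallow: /
```

[Some crawlers, Googlebot among them, later added wildcard support as an
extension, but the path-prefix form is the portable one.]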
I already did that. I'm waiting for the reply.

Bye
Cesar Mendoza
http://www.kitiara.org
--
"The three golden rules to ensure computer security: Do not own a computer,
do not power it on, and do not use it." --Robert T. Morris

