Hi,

FYI, I've added a robots.txt file in the document root of lists.debian.org,
containing:

User-agent: *
Disallow: *
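
A note on syntax: per the Robots Exclusion standard, the Disallow value is
a path prefix rather than a wildcard pattern, so "Disallow: *" may be
ignored by strictly conforming crawlers (it matches only paths beginning
with a literal "*"). The standard way to block the entire site would be:

User-agent: *
Disallow: /

Googlebot does understand "*" as a wildcard, but "Disallow: /" covers
conforming crawlers as well.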

Hopefully it will deter googlebot and the other web crawlers that seem to
have noticed the list archives and tend to DoS the machine... :/

-- 
Digital Electronic Being Intended for Assassination and Nullification
