Hi,

Googlebot and msnbot are supposed to obey robots.txt, but they are ignoring 
mine ( http://simpy.com/robots.txt ), which contains:

User-agent: *
Disallow: /simpy/
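
For what it's worth, those two lines do disallow the URL in question. A quick
sketch with Python's urllib.robotparser (which applies the same matching rules
a compliant crawler should; the fed-in rules and test URL mirror the ones
above) confirms it:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# Feed the rules directly instead of fetching http://simpy.com/robots.txt
rp.parse([
    "User-agent: *",
    "Disallow: /simpy/",
])

# The URL Googlebot keeps requesting, per the access log below
url = "http://simpy.com/simpy/User.do?username=otis"
print(rp.can_fetch("Googlebot", url))  # False: /simpy/ is disallowed for all agents
```

So the file itself is fine; the question is purely how long the crawlers take to pick it up.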

It's been more than two weeks since I updated my robots.txt, yet I still see 
requests like this from Googlebot and msnbot:

66.249.65.107 - - [26/Mar/2006:10:15:01 -0500] "GET 
/simpy/User.do?username=otis HTTP/1.1" 200 2678 "-" "Mozilla/5.0 (compatible; 
Googlebot/2.1; +http://www.google.com/bot.html)"


Does anyone know how long it takes those two to stop crawling pages that are 
supposed to be disallowed for them?

Thanks,
Otis


_______________________________________________
Robots mailing list
[email protected]
http://www.mccmedia.com/mailman/listinfo/robots
