I just looked at your file.

    You have the wildcard at the top.  You need to move the

        User-agent: *
        Disallow: /simpy/

record to the end of the file.  It should be the very last entry.

    What happens is that Googlebot reaches the * record first, accepts
the instructions there, and never gets as far as its own individual
entry.
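
    For example (a rough sketch; the Googlebot record below is only a
placeholder, so substitute whatever rules you actually want for that
bot), the reordered file would look something like:

        User-agent: Googlebot
        Disallow: /simpy/

        User-agent: *
        Disallow: /simpy/

    A crawler that stops at the first record matching its name will
then find its own section before it falls through to the catch-all.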



                                                                    Fred

----- Original Message -----
From: <[EMAIL PROTECTED]>
To: <robots@MCCMEDIA.COM>
Sent: Sunday, March 26, 2006 10:25 AM
Subject: [Robots] Googlebot, msnbot, and robots.txt refresh


> Hi,
>
> Googlebot and msnbot are supposed to obey robots.txt, but they are
> ignoring my robots.txt (http://simpy.com/robots.txt), which contains:
>
> User-agent: *
> Disallow: /simpy/
>
> It's been more than 2 weeks since I updated my robots.txt, yet I still
> see this from Googlebot and msnbot:
>
> 66.249.65.107 - - [26/Mar/2006:10:15:01 -0500] "GET
> /simpy/User.do?username=otis HTTP/1.1" 200 2678 "-" "Mozilla/5.0
> (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
>
>
> Does anyone know how long it takes those two to stop crawling pages that
> are supposed to be disallowed for them?
>
> Thanks,
> Otis

_______________________________________________
Robots mailing list
Robots@mccmedia.com
http://www.mccmedia.com/mailman/listinfo/robots
