> Anyone out there have a systematic method for identifying robot hits on WWW
> pages?
>
> So far, all I go on is the User Agent string in the log files.

I wrote something many, many moons ago called BotWatch. It used a variety
of signals, including robots.txt fetches, originating URLs, and User-Agent
strings, to try to identify robots.

You can find it at http://www.tardis.ed.ac.uk/home/sxw/robots/botwatch.html

The configuration file hasn't been updated in a long time. Your mileage may
vary.
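
If you just want something quick, here's a rough sketch of the same sort of
heuristic in Python. It assumes an NCSA combined-format access log; the
User-Agent signature list is illustrative, not BotWatch's actual
configuration:

#!/usr/bin/env python3
# Rough sketch of heuristic robot detection from a combined-format
# access log. Signature list and rules are illustrative assumptions,
# not BotWatch's actual configuration.
import re
import sys
from collections import defaultdict

# host ident user [date] "request" status bytes "referer" "agent"
LOG_RE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" \d+ \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"$')

# Substrings commonly found in crawler User-Agent strings.
AGENT_HINTS = ('bot', 'crawler', 'spider', 'slurp')

def classify(lines):
    suspects = defaultdict(set)   # host -> reasons it looks like a robot
    for line in lines:
        m = LOG_RE.match(line.strip())
        if not m:
            continue              # skip lines that don't parse
        host = m.group('host')
        agent = m.group('agent').lower()
        # Heuristic 1: only robots normally fetch /robots.txt.
        if m.group('path').startswith('/robots.txt'):
            suspects[host].add('fetched robots.txt')
        # Heuristic 2: self-identifying User-Agent strings.
        if any(h in agent for h in AGENT_HINTS):
            suspects[host].add('agent: ' + m.group('agent'))
    return suspects

if __name__ == '__main__':
    with open(sys.argv[1]) as f:
        for host, reasons in sorted(classify(f).items()):
            print(host + ': ' + ', '.join(sorted(reasons)))

Run it as "python3 botcheck.py access.log" and it prints one line per
suspect host with the reasons it was flagged. Quiet robots that never ask
for robots.txt and fake a browser User-Agent will still slip through, which
is why BotWatch combined several signals rather than relying on any one.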

Cheers,

Simon.
