Anyone out there have a systematic method for identifying robot hits on WWW
pages?

So far, all I go on is the User-Agent string in the log files.

Any other suggestions are appreciated.
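
For what it's worth, here is a rough sketch of one possible approach: combine the User-Agent check with a second heuristic, requests for /robots.txt, since well-behaved robots fetch that file and ordinary browsers almost never do. The User-Agent substrings below are illustrative examples, not an exhaustive list, and the regex assumes combined (or common) log format.

```python
import re

# Illustrative substrings often seen in crawler User-Agent strings.
# This list is a guess, not exhaustive -- extend it from your own logs.
BOT_UA_PATTERNS = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

# Combined log format:
#   host ident user [date] "method path proto" status bytes "referer" "user-agent"
# The trailing referer/user-agent pair is optional (common log format).
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" \d+ \S+'
    r'(?: "[^"]*" "([^"]*)")?'
)

def looks_like_robot(log_line):
    """Return True if an access-log line looks like a robot hit."""
    m = LOG_RE.match(log_line)
    if not m:
        return False
    path, ua = m.group(2), m.group(3) or ""
    # Heuristic 1: robots typically request /robots.txt.
    if path == "/robots.txt":
        return True
    # Heuristic 2: User-Agent string matches a known crawler pattern.
    return bool(BOT_UA_PATTERNS.search(ua))
```

Neither heuristic is airtight: some robots fake browser User-Agent strings and skip /robots.txt, so flagged hosts are best treated as candidates for closer inspection (e.g. unusually regular request intervals) rather than definite robots.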
