>> >I have heard of an algorithm of 10 requests within 10 minutes from the
>> >same IP being used to flag potential spiders. However, that still
>> >leaves potential proxy server requests.
>>
Considering that spiders are always changing and mutating, I think the best way
to keep track of them is to join efforts in building a spider-tracking site
listing all their names and user-agent strings, coupled with verification by
users. php3 anyone?
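
To illustrate, here's a minimal sketch (in Python rather than php3) of the kind
of lookup such a site could back: check a request's user-agent string against a
shared list of known spider signatures. The list below is made up for the
example, not real data from any existing site.

KNOWN_SPIDER_SIGNATURES = [
    "Googlebot",      # illustrative entries only; a real list would be
    "Slurp",          # maintained and verified by the site's users
    "ia_archiver",
]

def looks_like_spider(user_agent):
    """Return True if the user-agent matches any known spider signature."""
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in KNOWN_SPIDER_SIGNATURES)

print(looks_like_spider("Mozilla/4.0 (compatible; Googlebot/2.1)"))  # True
print(looks_like_spider("Mozilla/4.5 [en] (WinNT; I)"))              # False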
NB: 10 requests in 10 minutes from the same IP could just as well be a web
whacker, or any kind of one-off script. We get them here from time to time;
some are very badly written and clog up our sites when they find CGIs.
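
For what it's worth, here's a rough sketch (again Python, purely illustrative)
of that 10-requests-in-10-minutes check applied to a Common Log Format access
log. The filename and field positions are assumptions, and as noted above a web
whacker or one-off script will trip it just as easily as a real spider.

from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)
THRESHOLD = 10

def flag_busy_ips(lines):
    """Yield (ip, time) the first time an IP makes THRESHOLD requests within WINDOW."""
    recent = defaultdict(deque)   # ip -> timestamps of its recent requests
    flagged = set()
    for line in lines:
        fields = line.split()
        if len(fields) < 4:
            continue
        ip = fields[0]
        # CLF timestamps look like [01/Jan/1999:12:34:56 +0000]
        when = datetime.strptime(fields[3].lstrip("["), "%d/%b/%Y:%H:%M:%S")
        q = recent[ip]
        q.append(when)
        while q and when - q[0] > WINDOW:
            q.popleft()
        if len(q) >= THRESHOLD and ip not in flagged:
            flagged.add(ip)
            yield ip, when

if __name__ == "__main__":
    with open("access.log") as log:   # filename is an assumption
        for ip, when in flag_busy_ips(log):
            print(ip, "made", THRESHOLD, "requests within 10 minutes by", when)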
Ale