Hi!

On Wed, Mar 13, 2002 at 02:08:35PM -0500, Neil Gunton wrote:
> There are some User-Agents that keep hitting my site, and they're
> driving me up the wall. [...]  I have blocked these agents (using
> the BlockAgent script in the O'Reilly mod_perl book),

There is an easier way that doesn't need mod_perl. I would use
something like the following:

BrowserMatchNoCase "(PBrowse|[DPR]Surf15a)" is_a_bot
<Limit>
        [...]
        Deny from env=is_a_bot
</Limit>
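For completeness, here is a fuller sketch of what that could look like in
context. The directory path, the method list, and the Order/Allow lines
are illustrative placeholders, not from the original configuration:

    # Tag requests from the offending User-Agents
    BrowserMatchNoCase "(PBrowse|[DPR]Surf15a)" is_a_bot

    # Hypothetical document root -- adjust to your setup
    <Directory "/var/www/html">
        <Limit GET POST>
            Order allow,deny
            Allow from all
            # Refuse any request carrying the is_a_bot env variable
            Deny from env=is_a_bot
        </Limit>
    </Directory>

The Deny line only fires when BrowserMatchNoCase has set the variable,
so normal browsers are unaffected.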

> The requests come from a large number of IP addresses (though some
> IP's are used over a period of weeks), so blocking by IP is
> impractical.

Try to find out (using whois or nslookup) whether the IP belongs to
some ISP. If it does, complain to abuse@<isp>: this often helps.

If they don't belong to ISPs, it sounds like they used the same
technique as DDoS attacks: using root kits to spread the clients
over a large number of hosts. In this case the responsible
administrator will be glad if you inform him about the compromised
systems.

That's at least my experience and solution with unfriendly crawlers
and script kiddies. (Although I mainly had to fight search engines
that were indexing pages we had disallowed in robots.txt.)
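For reference, such a robots.txt entry looks like the following (the
path is a made-up example). Well-behaved crawlers honor it; the kind
discussed here usually don't, which is why the Deny approach above is
needed as a backstop:

    # Placed at the web root as /robots.txt
    User-agent: *
    Disallow: /private/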

            Regards, Axel Beckert
-- 
-------------------------------------------------------------
Axel Beckert      ecos electronic communication services gmbh
Internetconnect * Webserver/-design/-datenbanken * Consulting

Post:       Tulpenstrasse 5         D-55276 Dienheim b. Mainz
E-Mail:     [EMAIL PROTECTED]         Voice:    +49 6133 926530
WWW:        http://www.ecos.de/     Fax:      +49 6133 925152
-------------------------------------------------------------


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]