This is decidedly off-topic...

We run a pretty small website (multi-use) on Apache (2.2) and mod_perl
(along with some php, cgi, and static content).  Unfortunately, our
organization has recently decided to institute the policy of scanning
the site on a regular basis for security reasons.  The scan software
crawls all links and URLs on the site, hitting each one with multiple
forms of attack.  In some parts of the world, this is called a
denial-of-service attack, but here it is called a security scan.  I
have no control over the scan parameters, so I am looking for a
meaningful way of limiting the number of connections (not really
bandwidth, since we host VERY large static files) from a single IP.
Any suggestions?
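For anyone replying: the closest thing I've found so far is the third-party mod_limitipconn module. A minimal sketch, assuming the module is actually built and loaded against Apache 2.2; the limit value is a placeholder, not a recommendation:

```apache
# Sketch only: assumes the third-party mod_limitipconn module is
# compiled and loaded (LoadModule limitipconn_module ...).
<IfModule mod_limitipconn.c>
    <Location />
        # At most 10 simultaneous connections per client IP, site-wide.
        # Note this counts connections, not bandwidth, so large static
        # downloads each still occupy one slot against the limit.
        MaxConnPerIP 10
    </Location>
</IfModule>
```

The module also has NoIPLimit/OnlyIPLimit directives for exempting or targeting content by MIME type, which might matter given the large static files. I don't know whether this plays well with mod_perl handlers, though, which is partly why I'm asking.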

Thanks,
Sean
