On Fri, 11 Jun 2004 10:51:43 -0500 "Chris Thielen" <[EMAIL PROTECTED]> wrote:
> I use wpoison on my system, and have implemented a bot trap like
> described above. I have my wpoison CGI in robots.txt so normal spiders
> shouldn't be affected... I wrote a small filter which watches for
> wpoison hits, and after a few hits, returns 404 for *any* url requested
> from that IP address. Since I noticed that there are frequently hits
> from multiple IP Addresses accessing links using the same JSESSIONID
> (appended to links on the front page; generated by the web server), I
> also group any other IPs using the same session ID into the ban. This
> effectively stops a botnet from retrieving the front page, then having
> each client spider a different branch because if one client spiders the
> wpoison branch, they all get banned.

Sounds like a great idea. I do a similar trick by monitoring access to ports on my server: once a connection is made (to port 22 or whatever), the remote IP address gets permanently banned at the firewall. It's great fun watching kids trying exploits or rootkits on the system :]
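For anyone curious how the quoted scheme hangs together, here is a minimal sketch of the ban bookkeeping it describes (not the actual filter, and the threshold and class/method names are my own invention): count hits on the wpoison URL per client IP, and once an IP crosses the threshold, ban it along with every other IP seen using any of its JSESSIONIDs.

```python
# Sketch of the described bot-trap ban logic. Assumptions (not from the
# original post): a hit threshold of 3, and the class/method names below.
from collections import defaultdict

HIT_THRESHOLD = 3  # hypothetical: wpoison hits allowed before banning

class BotTrap:
    def __init__(self, threshold=HIT_THRESHOLD):
        self.threshold = threshold
        self.hits = defaultdict(int)          # ip -> wpoison hit count
        self.session_ips = defaultdict(set)   # jsessionid -> IPs seen using it
        self.ip_sessions = defaultdict(set)   # ip -> jsessionids it presented
        self.banned = set()

    def record(self, ip, jsessionid=None, wpoison_hit=False):
        """Feed one request; return the set of newly banned IPs."""
        if jsessionid:
            self.session_ips[jsessionid].add(ip)
            self.ip_sessions[ip].add(jsessionid)
        if wpoison_hit:
            self.hits[ip] += 1
        newly_banned = set()
        if self.hits[ip] >= self.threshold and ip not in self.banned:
            # Ban this IP plus every IP sharing any of its session IDs,
            # so a botnet splitting one crawl across clients is stopped
            # as soon as one client wanders into the wpoison branch.
            group = {ip}
            for sid in self.ip_sessions[ip]:
                group |= self.session_ips[sid]
            newly_banned = group - self.banned
            self.banned |= group
        return newly_banned

    def is_banned(self, ip):
        """The web server would answer 404 to any URL from these IPs."""
        return ip in self.banned
```

The session-ID grouping is the interesting part: two bots that never touch the trap themselves still get banned the moment a third bot sharing their JSESSIONID does.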
