In the past couple of days, I've been having problems with spiders vandalizing the Wiki at http://www.sqlite.org/cvstrac/wiki. The damage (so far) has been relatively minor and easy to fix. But I've been monitoring these spiders for a while and notice that they are becoming increasingly aggressive.
If you have any suggestions on what to do about them, I'd like to hear from you.
My suggestion is to use a trap: an area disallowed in robots.txt, linked near the top of the main index page in a way that human users won't click on. Well-behaved crawlers honor robots.txt and humans never follow the hidden link, so any IP that visits the page is a misbehaving spider. For any such IP, install a firewall or httpd.conf block of some description.
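The trap idea could be sketched roughly as below. This is only an illustration, not working server code; the `/trap` path, the in-memory ban set, and the `handle_request` helper are all hypothetical stand-ins for whatever the real web server and firewall would do.

```python
# Hypothetical sketch of the robots.txt honeypot: /trap is disallowed
# for crawlers and invisible to humans, so any visitor to it gets banned.

ROBOTS_TXT = """User-agent: *
Disallow: /trap
"""

banned_ips = set()  # stand-in for a real firewall / httpd.conf block list

def handle_request(ip, path):
    """Serve a page, banning any IP that takes the bait."""
    if ip in banned_ips:
        return "403 Forbidden"
    if path == "/trap":
        banned_ips.add(ip)   # a well-behaved spider never reaches this line
        return "403 Forbidden"
    if path == "/robots.txt":
        return ROBOTS_TXT
    return "200 OK"
```

In a real deployment the ban would be pushed out to the firewall or web server configuration rather than held in memory, but the logic is the same: the hidden page is the tripwire.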
Beyond that, requiring a secure-hash token to change a page works very well: the edit form carries a signed timestamp, and a submission is rejected if it arrives earlier than 5 seconds after the edit page was downloaded (faster than any human could edit) or later than N minutes after.
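One way the timed token could work is an HMAC over the page name and issue time, checked against the submission time. A minimal sketch, assuming a server-side secret and a 15-minute upper window for "N minutes" (all names here are illustrative, not from any particular wiki engine):

```python
import hashlib
import hmac
import time

SECRET = b"server-side secret"   # hypothetical signing key
MIN_DELAY = 5                    # seconds: faster than a human can edit
MAX_AGE = 15 * 60                # seconds: the "N minutes" window

def issue_token(page, now=None):
    """Embed a signed timestamp in the edit form when it is served."""
    ts = str(int(now if now is not None else time.time()))
    sig = hmac.new(SECRET, f"{page}:{ts}".encode(), hashlib.sha256).hexdigest()
    return f"{ts}:{sig}"

def check_token(page, token, now=None):
    """Accept a submission only inside the [MIN_DELAY, MAX_AGE] window."""
    try:
        ts, sig = token.split(":")
    except ValueError:
        return False
    expected = hmac.new(SECRET, f"{page}:{ts}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    age = (now if now is not None else time.time()) - int(ts)
    return MIN_DELAY <= age <= MAX_AGE
```

Because the timestamp is signed, a spider can't forge an "old enough" token, and replaying a scraped form fails once the window expires.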
Matt.
--------------------------------------------------------------------- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]