In the past couple of days, I've been having problems with spiders [...]
You could use a robots.txt file to guard against the spiders that behave well. If the misbehaving spiders send a distinguishable User-Agent header, you could block requests carrying that header at the server level.
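A minimal sketch of both approaches. The path and bot name (`/private/`, `BadSpider`) are placeholders, and the second snippet assumes you are serving through Apache httpd with mod_setenvif enabled:

```
# robots.txt -- only well-behaved crawlers honor this
User-agent: *
Disallow: /private/
```

```
# Apache httpd config -- refuse a misbehaving User-Agent outright
# (assumes mod_setenvif; "BadSpider" is a hypothetical bot name)
SetEnvIfNoCase User-Agent "BadSpider" bad_bot
<Directory "/var/www/html">
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>
```

Note that robots.txt is purely advisory, so a spider that is already misbehaving will likely ignore it; the server-side deny is the only enforceable part.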
-- Gerhard
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]