    Mostly I was looking for comments or feedback on whether DROPing
    packets from crawlers is a good or bad thing to do.

Either DROP or REJECT, take your choice. f2b (fail2ban) usually does
REJECT, I believe. I see pros and cons either way: DROP leaves the
sender waiting for a timeout, while REJECT refuses the connection
immediately, which is politer but confirms that something is filtering.
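
For concreteness, a minimal sketch of both approaches with iptables
(192.0.2.1 is a documentation placeholder address, not a real crawler):

    # Silently discard everything from the offending address;
    # its connection attempts hang until they time out.
    iptables -A INPUT -s 192.0.2.1 -j DROP

    # Or refuse actively: the kernel replies with an ICMP
    # port-unreachable error, which is what fail2ban's default
    # block type does, if I recall correctly.
    iptables -A INPUT -s 192.0.2.1 -j REJECT --reject-with icmp-port-unreachable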

    problem has been lack of appropriate robots.txt files.  

My tug.org robots.txt does seem to stop a surprising majority of
undesired crawling. Of course you could grab it if helpful.

    User-agent: *
    Disallow: /

Ack :).
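
If shutting everyone out is too drastic, robots.txt can also name
particular crawlers; the user-agents below are just a couple of
well-known examples (GPTBot is OpenAI's crawler, CCBot is Common
Crawl's), not any kind of complete list:

    # Turn away a couple of specific bots by name.
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # Everyone else may crawl.
    User-agent: *
    Disallow:

Of course any of this only works insofar as a crawler bothers to
read and honor robots.txt at all.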

Happy filtering,
Karl
