On 16/3/2026 09:32, Fred Wright wrote:

> This approach to bot-blocking is pretty lame. There's nothing to stop the bots from using User-Agent strings from current browsers. What then, require everyone to use lynx? IP-based filtering would be more robust and more user-friendly.
Blocking the IP ranges the scraping comes from would block most of the internet, and would almost certainly block more legitimate users than we do now. UA blocking is not a great solution, but it works (by which I mean the site is usable, which it would not be otherwise). The only real alternative is to deploy something like Anubis, which would also block legitimate, robots.txt-respecting scrapers.
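For anyone curious what UA blocking amounts to in practice, here is a minimal sketch (the substring list and function are hypothetical illustrations, not the site's actual block list or setup):

```python
# Hypothetical User-Agent substring blocking, sketched in Python.
# Real deployments usually do this at the reverse proxy (e.g. an
# nginx map on $http_user_agent), not in application code.

BLOCKED_UA_SUBSTRINGS = [
    # Hypothetical entries: identifiers of scrapers to reject.
    "GPTBot",
    "CCBot",
    "Bytespider",
]

def is_blocked(user_agent: str) -> bool:
    """Return True if the User-Agent contains any blocked substring
    (case-insensitive); such requests would get a 403."""
    ua = user_agent.lower()
    return any(s.lower() in ua for s in BLOCKED_UA_SUBSTRINGS)

print(is_blocked("Mozilla/5.0 (compatible; GPTBot/1.0)"))          # True
print(is_blocked("Mozilla/5.0 (X11; Linux x86_64) Firefox/124.0")) # False
```

As Fred notes, this is trivially evaded by a scraper that sends a browser UA string, which is exactly the trade-off under discussion.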

- Josh
