Hi,

[re-sent from the correct origin address]

> On 15. Mar 2026, at 23:32, Fred Wright <[email protected]> wrote:
> 
> This approach to bot-blocking is pretty lame.  There's nothing to stop the 
> bots from using User-Agent strings from current browsers.  What then, require 
> everyone to use lynx?  IP-based filtering would be more robust and less 
> user-unfriendly.

Please refrain from making statements about things for which you lack the data 
needed to judge their effectiveness.

Is the UA blocking lame? Absolutely. Should it really not work, and can bots 
trivially work around it? Yes. Do they? No. I don’t pretend to understand why.

IP-based filtering doesn’t work. I have a message pending in list moderation 
with a PDF attached that shows this, but the gist is: we get roughly 100 likely 
bot requests per day across literally thousands of different IPs, and only a 
small subset of IP ranges sends more than ~250 requests. We’d need much higher 
thresholds before blocking on IP makes sense, or you’ll just end up complaining 
that Trac locks you out after you’ve clicked 12 tickets.

So, it would be really nice if you didn’t try to devalue the work I’ve put in 
to keep Trac available by saying what I’ve done is stupid and wrong and you 
know better. This whole AI crawler situation isn’t exactly a motivating 
experience in the first place, and your feedback makes me just want to leave 
Trac broken for a few weeks instead.

In other news, I’ve added an exception for Firefox ESR 115, so if you use that 
or a derivative of it, you should still be able to access Trac. If you’re on 
something else, my patience for dealing with outdated operating systems and 
equally outdated browsers is at its end.
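For the curious, the exception amounts to a carve-out in the UA filter. This is 
a minimal sketch of the idea, not the actual rule running on Trac, and the 
version cutoff and regexes here are assumptions for illustration:

```python
import re

# Hypothetical sketch of a UA filter with an ESR 115 carve-out.
# Assumption: old-Firefox UAs are blocked below some cutoff, but
# Firefox 115.x (ESR) and derivatives reporting that token pass.
ESR_EXCEPTION = re.compile(r"Firefox/115\.\d+")
FIREFOX_VERSION = re.compile(r"Firefox/(\d+)\.")

def is_blocked(user_agent: str, min_version: int = 120) -> bool:
    """Block Firefox UAs older than min_version, except ESR 115."""
    if ESR_EXCEPTION.search(user_agent):
        return False
    m = FIREFOX_VERSION.search(user_agent)
    return bool(m) and int(m.group(1)) < min_version

print(is_blocked("Mozilla/5.0 (X11) Gecko/20100101 Firefox/115.9"))  # False
print(is_blocked("Mozilla/5.0 (X11) Gecko/20100101 Firefox/102.0"))  # True
print(is_blocked("Mozilla/5.0 (X11) Gecko/20100101 Firefox/128.0"))  # False
```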


Clemens
