On Wed, 18 Feb 2026 at 06:13, Marc-Andre Lemburg <[email protected]> wrote:
>
> Unfortunately, there's not a lot we can do against bots hitting the wiki.

Yes, but also there ARE *some* things we could do.

> Esp. AI crawlers have become the #1 "users" of the wiki in the past few
> months and those don't stick to any rules you give them. They also use
> multiple IP addresses, so it feels a bit like a DDoS.
>
> I don't think AI crawlers are a bad thing, but it really doesn't help if
> they bring down systems.

Agreed, so that would mean we'd need to rate-limit those requests in
some way. We're surely not the first to run into this problem; it has
to be a known issue by now.
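For instance, if the wiki sits behind nginx, its limit_req module can
throttle per-IP request rates. A rough sketch (zone name, rates, paths
and the upstream name are all placeholders that would need tuning for
our setup):

```nginx
# Hypothetical sketch -- rates and names are illustrative only.
# Allow each client IP roughly 2 requests/second, with a small burst.
limit_req_zone $binary_remote_addr zone=wiki_ratelimit:10m rate=2r/s;

server {
    location / {
        limit_req zone=wiki_ratelimit burst=10 nodelay;
        # Send 429 instead of the default 503 when the limit is hit.
        limit_req_status 429;
        proxy_pass http://wiki_backend;  # hypothetical upstream name
    }
}
```

This only slows a single IP, though; crawlers spread across many
addresses would still get through, so it's a mitigation, not a fix.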

> Wiki.js would be running Node.js on the server, so that would provide
> some extra performance.
>
> An alternative would be to use Wiki.js for editing and then have a
> static site generator pick up the Markdown from the Github storage repo
> and generate a static copy every few hours.
>
> The editing URL would then have to be made less visible, of course.
> Perhaps there's a way to also hide most of the content unless you are
> logged in, so that the URL is less attractive to bots.

Yeah. If the editing URL just redirects you back to a login page, it
won't be any use to bots.
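Something like this at the proxy level might do it (the /edit and
/login paths and the session cookie name are guesses; they'd depend on
whichever wiki software we end up with):

```nginx
# Hypothetical sketch -- path and cookie names depend on the wiki software.
# Requests under /edit without a session cookie get bounced to the
# login page instead of serving anything crawlable.
location /edit {
    if ($cookie_session = "") {
        return 302 /login;
    }
    proxy_pass http://wiki_backend;  # hypothetical upstream name
}
```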

ChrisA
_______________________________________________
pydotorg-www mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3//lists/pydotorg-www.python.org
Member address: [email protected]