Unfortunately, there's not a lot we can do about bots hitting the wiki.

AI crawlers in particular have become the #1 "users" of the wiki over the past few months, and they don't stick to any rules you give them. They also spread their requests across multiple IP addresses, so it feels a bit like a DDoS.

I don't think AI crawlers are a bad thing, but it really doesn't help if they bring down systems.

Wiki.js would run on Node.js on the server, which should give us some extra performance.

An alternative would be to use Wiki.js for editing and then have a static site generator pick up the Markdown from the GitHub storage repo and regenerate a static copy every few hours.
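That regeneration step could be as simple as a small cron-driven script. Everything below is a sketch, not a decided setup: the repo URL, the paths, and the choice of MkDocs as the generator are all assumptions for illustration; any Markdown-based generator would work the same way.

```shell
#!/bin/sh
# rebuild-wiki.sh -- hypothetical sketch: pull the wiki's Markdown from
# the (assumed) GitHub storage repo and regenerate the static copy.
set -e

REPO="https://github.com/example/wiki-content"   # assumption: storage repo URL
WORKDIR="/srv/wiki-build"                        # assumption: build checkout
OUTDIR="/var/www/wiki-static"                    # assumption: web root

# Clone on first run, fast-forward pull afterwards.
if [ -d "$WORKDIR/.git" ]; then
    git -C "$WORKDIR" pull --ff-only
else
    git clone --depth 1 "$REPO" "$WORKDIR"
fi

# MkDocs is only an example generator; it expects a mkdocs.yml in the checkout.
mkdocs build -f "$WORKDIR/mkdocs.yml" -d "$OUTDIR"
```

Running it "every few hours" would then be one crontab line, e.g. `0 */4 * * * /usr/local/bin/rebuild-wiki.sh`.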

The editing URL would then have to be made less visible, of course. Perhaps there's a way to also hide most of the content unless you are logged in, so that the URL is less attractive to bots.
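At the web server level, hiding the dynamic instance could look something like the fragment below, assuming nginx sits in front of Wiki.js. The paths, the `/edit/` prefix, and the upstream port are all hypothetical; the idea is just that crawlers only ever see the static copy, while editors authenticate to reach the real wiki.

```nginx
# Hypothetical nginx sketch: serve the static copy to everyone,
# and put the dynamic Wiki.js instance behind a login.

# Static, crawler-facing copy (no auth).
location / {
    root /var/www/wiki-static;       # assumed static output directory
}

# Dynamic Wiki.js instance for editors only.
location /edit/ {
    auth_basic           "Wiki editors";
    auth_basic_user_file /etc/nginx/wiki-editors.htpasswd;
    proxy_pass           http://127.0.0.1:3000;   # Wiki.js default port
}
```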


Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Experts (#1, Feb 17 2026)
Python Projects, Coaching and Support ...    https://www.egenix.com/
Python Product Development ...        https://consulting.egenix.com/
________________________________________________________________________

::: We implement business ideas - efficiently in both time and costs :::
eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               https://www.egenix.com/company/contact/
                     https://www.malemburg.com/

On 17.02.2026 20:00, Chris Angelico wrote:
> On Wed, 18 Feb 2026 at 05:54, Marc-Andre Lemburg <[email protected]> wrote:
>> On 17.02.2026 18:59, Mats Wichmann wrote:
>>> On 2/16/26 09:27, Marc-Andre Lemburg wrote:
>>>> I emailed with Jacob.
>>>>
>>>> There are a few parts to this:
>>>>
>>>>    * The wiki was getting a lot of load (see my email)
>>>>    * Jacob is currently pretty much the only admin the PSF has, and so
>>>>      the load is high on him as well
>>>>    * The wiki engine is ancient and cannot handle the load anymore
>>>>    * We will start looking into alternative backend systems and then
>>>>      migrate, e.g. to Wiki.js or BookStack
>>>>    * Until then, the wiki is run in static form
>>>>
>>>> We do need to get the incoming URLs working again, though. This
>>>> should be possible using a redirect directive.
>>>>
>>>> I asked him to update the header accordingly.
>>> Seems like some redirection has come back... yesterday, the pointer to
>>> the wiki events page from the events category on Discuss didn't work,
>>> now it does:
>>>
>>> https://discuss.python.org/t/about-the-events-category/39334
>>>
>>> Also, the banner now says "in the process of being archived" rather
>>> than "archived", and notes "Edits are discouraged", as if they were
>>> now possible.
>> Yes, we are chatting about ways forward.
>>
>> It's likely that we'll move to a Wiki.js system with Git storage backup,
>> but nothing is decided yet.
>>
>> Jacob suggested converting the wiki to a statically generated site based
>> on a repo, but that would basically turn it into yet another docs WG
>> website, which I would like to avoid.
>>
>> The wiki nature should stay, IMO.
>>
>> What do you think?
> I definitely think it should be a wiki. The barrier to entry for a
> wiki is FAR lower than for a docs repo, even if we try to be very
> welcoming of PRs on that repo.
>
> But if MoinMoin needs to go away, then so be it. I wouldn't be averse
> to changing software if it fixes the constant DoS issues. Though I
> would also want to look into a fail2ban firewall rule or similar.
>
> ChrisA
_______________________________________________
pydotorg-www mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3//lists/pydotorg-www.python.org
Member address: [email protected]