I am not sure this is related, but I have not been able to load the 
"Visualization version" links for theorems created in 2024 or later, and 
possibly earlier as well. One example is the theorem norcom. I get a "404 
File not found" error message. I like the "Visualization version" pages 
because they let me copy and paste formulas directly into mmj2.

On Wednesday, November 19, 2025 at 2:14:22 AM UTC-8 [email protected] wrote:

On Mon, Nov 17, 2025 at 11:08 AM [email protected] <
[email protected]> wrote:

"It was restarted 3 minutes ago"

Wow, this finally fixed '?' references in PDF files (example 
<https://us.metamath.org/downloads/finiteaxiom.pdf>), which should have been 
fixed <https://github.com/metamath/metamath-website-scripts/issues/19> in 
June 2024 (archive from August 2025 
<https://web.archive.org/web/20250801041609/https://us.metamath.org/downloads/finiteaxiom.pdf>).
It seems that nobody bothered to rebuild it since then, despite my comment 
in the pull request over one year ago.


I think this is because, while metamath-website-seed is re-pulled every day 
as part of the site rebuild, metamath-website-scripts is only re-pulled 
when the Linode machine is rebooted, which has basically happened only 
once(?), just now, since we started using this workflow after Norm passed 
in 2021. So we've been running an old script since then... Ideally the daily 
script would also replace itself with the latest version from 
metamath-website-scripts to avoid this issue.
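
For concreteness, here is a minimal sketch of what that self-update could 
look like, assuming the daily job runs from a git checkout of 
metamath-website-scripts; the paths and the rebuild script name below are 
made up for illustration, not the real layout on the machine:

    #!/bin/sh
    # Sketch of a daily rebuild wrapper that updates itself first.
    # Paths and script names are assumptions, not the actual layout.
    set -eu

    SCRIPTS_DIR=/opt/metamath-website-scripts   # assumed checkout location

    # Pull the latest scripts before doing anything else, so fixes merged into
    # metamath-website-scripts take effect on the next daily run instead of on
    # the next reboot. (Changes to this wrapper itself would take effect the
    # following day.)
    cd "$SCRIPTS_DIR"
    git pull --ff-only

    # Then run the freshly pulled rebuild script (name is an assumption).
    exec ./rebuild-site.sh

That way a reboot of the machine is never needed just to pick up script fixes.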

On Mon, Nov 17, 2025 at 5:15 PM 'David A. Wheeler' via Metamath <
[email protected]> wrote: 

Regenerating the website takes more time than GitHub Actions will provide 
for free.


I think this is not necessarily the case. The mm-web-rs website generator 
is much faster than metamath-exe; I expect full website generation 
to take 5 minutes or less if we actually use it.

On Mon, Nov 17, 2025 at 5:15 PM 'David A. Wheeler' via Metamath <
[email protected]> wrote:

The website is regenerated daily. This is a *good* thing, because serving 
static pages to many systems is efficient, and all websites are now being 
hit by constant downloads (organizations are constantly re-downloading the 
Internet to feed to machine learning training processes).


This is true, though; the web has changed significantly since I started 
thinking about this plan, because AI bots are continuously harassing 
webmasters globally. But I think this is nothing that can't be solved 
with a caching layer, which we would have regardless (indeed we already 
have one as part of our nginx deployment). On-demand generation is only an 
improvement in total processing power if it's not the case that the entire 
website (every single page) is requested every 24 hours anyway. Right now 
it's possible that the entire site is being downloaded every day, but if we 
add things that make the site theoretically infinite then this won't be the 
case anymore. It comes down to a question of how big the cache is vs. how 
much it costs to generate a page. I think it's difficult to know for sure 
without trying it.
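
To illustrate the kind of caching layer I have in mind, here is a minimal 
nginx sketch in front of an on-demand page generator; the upstream address, 
cache size, and expiry times are placeholders for illustration, not our 
actual deployment:

    # Sketch only: cache pages produced by a dynamic generator (assumed to be
    # listening on 127.0.0.1:8080) so repeated bot downloads hit the cache
    # instead of triggering regeneration.
    proxy_cache_path /var/cache/nginx/metamath levels=1:2
                     keys_zone=mmcache:50m max_size=5g
                     inactive=7d use_temp_path=off;

    server {
        listen 80;
        server_name us.metamath.org;

        location / {
            proxy_pass http://127.0.0.1:8080;   # assumed page generator
            proxy_cache mmcache;
            proxy_cache_valid 200 24h;          # keep generated pages for a day
            proxy_cache_use_stale error timeout updating;
            add_header X-Cache-Status $upstream_cache_status;
        }
    }

With something like this, a page is only generated once per expiry window no 
matter how many times it is downloaded, so the cost question really does come 
down to cache size versus per-page generation time.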
