As far as Analytics / Statistics are concerned, this script is just an
interesting artifact. The problems we have to solve while reading and
processing XML dumps are quite different (custom file formats to handle
files bigger than a single HDFS block, etc.). Safe to delete, in my
opinion; the less code we have to maintain, the better.
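As a rough sketch of the boundary problem those custom input formats solve: a worker handed an arbitrary byte range of a huge dump has to scan forward to the next <page> tag before it can start parsing. Everything below (function name, constants) is hypothetical illustration, not Analytics' actual code.

```python
# Sketch of the record-alignment trick behind splittable input formats:
# align an arbitrary byte offset to the next <page> record boundary.
# Hypothetical illustration only, not the production Analytics code.

RECORD_START = b"<page>"
CHUNK = 64 * 1024

def find_record_start(f, split_start: int, split_end: int):
    """Return the absolute offset of the first <page> tag at or after
    split_start, or None if no record starts before split_end."""
    f.seek(split_start)
    window = b""
    base = split_start  # absolute offset of window[0]
    while base < split_end:
        chunk = f.read(CHUNK)
        if not chunk:
            return None
        window += chunk
        idx = window.find(RECORD_START)
        if idx != -1:
            offset = base + idx
            return offset if offset < split_end else None
        # keep a short tail so a tag split across two reads is still found
        keep = len(RECORD_START) - 1
        base += len(window) - keep
        window = window[-keep:]
    return None
```

In Hadoop terms this is roughly what a custom InputFormat's record reader does: the first split starts at byte 0, every other split skips ahead to the first full record, and each reader reads past its split's end to finish the record it started, so no page is parsed twice or lost.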
This script was created in 2011 and takes an offline XML dump file,
containing page content wikitext, and feeds its entries through the
Preprocessor without actually importing any content into the wiki.
The documented purpose of the script is to "get statistics" or "fill the
cache". I was unable to find any evidence that it is still used for
either purpose.
The 1.36.0-wmf.27 version of MediaWiki[0] is at group0[1], and is
currently blocked. Thanks to James Forrester, RhinosF1, abi_,
Nikerabbit, Urbanecm, Daimona, Ladsgroup, Marostegui, and probably
others I'm forgetting for getting us this far.
The new version cannot be deployed further until the blocking issues are
resolved.

https://www.mediawiki.org/wiki/Scrum_of_scrums/2021-01-20
2021-01-20
== Callouts ==
* Seats are still available for the frontend web performance training.
The Doodle currently seems to favor West Coast times:
https://doodle.com/poll/ax9qak5mwb7rzvuh
=== No updates ===