Forwarding in case this is of interest to anyone on the Analytics or
Research lists who doesn't subscribe to Wikitech-l or Xmldatadumps-l.

Pine
( https://meta.wikimedia.org/wiki/User:Pine )

---------- Forwarded message ----------
From: Ariel Glenn WMF <[email protected]>
Date: Fri, Jul 20, 2018 at 5:53 AM
Subject: [Wikitech-l] hewiki dump to be added to 'big wikis' and run with
multiple processes
To: Wikipedia Xmldatadumps-l <[email protected]>,
Wikimedia developers <[email protected]>


Good morning!

The pages-meta-history dumps for hewiki currently take 70 hours, the
longest of any wiki not already running with parallel jobs. I plan to add
it to the list of 'big wikis' starting August 1st, meaning that six jobs
will run in parallel and produce the usual numbered output files; see the
frwiki dumps for an example.

Please adjust any download/processing scripts accordingly.
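
If a script currently fetches a single hewiki pages-meta-history file, one
way to adapt is to enumerate the numbered parts from the run's
dumpstatus.json instead of hard-coding one filename. Below is a minimal
Python sketch of that approach; it is not from the announcement, and the
run date, job/file layout of dumpstatus.json, and field names used here
are assumptions to verify against an actual run directory.

    #!/usr/bin/env python3
    # Sketch: list all numbered pages-meta-history parts for a hewiki dump run.
    # Assumptions: dumpstatus.json layout and the DUMP_DATE value are
    # illustrative; check the real run directory before relying on them.
    import requests

    BASE = "https://dumps.wikimedia.org"
    WIKI = "hewiki"
    DUMP_DATE = "20180801"  # hypothetical run date; substitute the run you want

    # dumpstatus.json describes every job in the run and the files it produced.
    status = requests.get(f"{BASE}/{WIKI}/{DUMP_DATE}/dumpstatus.json").json()

    # Collect every numbered history part rather than expecting a single file.
    history_files = []
    for job in status.get("jobs", {}).values():
        for name, info in job.get("files", {}).items():
            if "pages-meta-history" in name and name.endswith(".bz2"):
                history_files.append((name, BASE + info.get("url", "")))

    for name, url in sorted(history_files):
        print(name, url)

The same pattern (iterate over all parts, never assume a single file) applies
to any downstream processing step that reads the history dump.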

Thanks!

Ariel
