[Xmldatadumps-l] "Extracted page abstracts for Yahoo" for Wikidata

2014-07-06 Thread Amir Ladsgroup
Hello, Wikidata dumps (e.g. this) include an annoying extra file named "Yahoo abstracts". It is more than 16 GB (mainly because it's not zipped), and because the content of Wikidata pages is stored as numbers and codes instead of wikitext (e.g. this
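The "numbers and codes instead of wikitext" point can be seen in a minimal sketch: a main-namespace Wikidata page is entity JSON whose statements are keyed by opaque property and item IDs, so a plain-text "abstract" extracted from it carries no readable summary. The record below is heavily trimmed and illustrative, not copied from a real dump (Q64, P31, and Q515 are real Wikidata IDs for Berlin, "instance of", and "city").

```python
import json

# Heavily trimmed, illustrative Wikidata entity record; real entities are
# far larger, but the shape (IDs and numeric codes, not prose) is the same.
entity = json.loads("""
{
  "id": "Q64",
  "labels": {"en": {"language": "en", "value": "Berlin"}},
  "claims": {
    "P31": [
      {"mainsnak": {"snaktype": "value",
                    "property": "P31",
                    "datavalue": {"type": "wikibase-entityid",
                                  "value": {"entity-type": "item",
                                            "numeric-id": 515}}}}
    ]
  }
}
""")

# Everything useful hides behind identifiers: property P31 ("instance of")
# points at another item by numeric ID (Q515, "city"), not by name.
for prop, statements in entity["claims"].items():
    for st in statements:
        target = st["mainsnak"]["datavalue"]["value"]["numeric-id"]
        print(f'{entity["id"]} {prop} -> Q{target}')
```

This is why a generic "first chunk of page text" abstract extractor produces nothing meaningful for Wikidata.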

Re: [Xmldatadumps-l] Fwd: Proposal: Stop dumping inactive/closed wikis

2015-01-17 Thread Amir Ladsgroup
Talking about wasting resources: another waste of resources is dumping summaries for Yahoo! (Really? Why the hell? Is it our job to dump 20 GB every damn month, just for Wikidata, for Yahoo! bots?) Please explain why the Wikimedia Foundation should spend r

[Xmldatadumps-l] Two dumps are in progress at the same time for Wikidata

2015-02-07 Thread Amir Ladsgroup
http://dumps.wikimedia.org/wikidatawiki/20150207/ http://dumps.wikimedia.org/wikidatawiki/20150204/ What's wrong? -- Amir

Re: [Xmldatadumps-l] [Wikitech-l] Change for abstracts dumps, primarily for wikidata

2018-04-04 Thread Amir Ladsgroup
I love this change, thank you! On Wed, Apr 4, 2018 at 4:33 PM Ariel Glenn WMF wrote:
> Those of you that rely on the abstracts dumps will have noticed that the
> content for wikidata is pretty much useless. It doesn't look like a
> summary of the page because main namespace articles on wikidata
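For readers who consume these abstracts dumps, here is a minimal parsing sketch. It assumes the usual abstracts-dump XML shape (a `<feed>` of `<doc>` entries containing `<title>`, `<url>`, and `<abstract>`); the inline sample is illustrative, not taken from a real dump file.

```python
import xml.etree.ElementTree as ET

# Illustrative fragment in the shape of an abstracts dump. A real file would
# be fetched from dumps.wikimedia.org and streamed with ET.iterparse instead
# of being held in memory as a string.
sample = """
<feed>
  <doc>
    <title>Wikipedia: Example article</title>
    <url>https://en.wikipedia.org/wiki/Example_article</url>
    <abstract>A short plain-text summary of the page.</abstract>
  </doc>
</feed>
"""

root = ET.fromstring(sample)
abstracts = {
    doc.findtext("title"): doc.findtext("abstract")
    for doc in root.iter("doc")
}
print(abstracts)
```

For a wiki like Wikidata, the `<abstract>` field is exactly where the unusable JSON-derived text ended up, which is what the change discussed above addresses.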