https://bugzilla.wikimedia.org/show_bug.cgi?id=54369

--- Comment #17 from Daniel Kinzler <daniel.kinz...@wikimedia.de> ---
Creating a dump for the 349 items I have in the DB on my laptop takes about 2
seconds. These are not very large, but then, most items on Wikidata are not
large either (while a few are very large).

 > time php repo/maintenance/dumpJson.php > /dev/null
  Processed 100 entities.
  Processed 200 entities.
  Processed 300 entities.
  Processed 349 entities.

  real    0m2.385s
  user    0m1.996s
  sys    0m0.088s

All data is available in the XML dumps, but we'd need two passes (for the first
pass, a dump of the property namespace would be sufficient). I don't currently
have a dump locally. 

The script would need quite a bit of refactoring to work based on XML dumps;
I'd like to avoid that unless we are sure it is actually necessary / desirable.
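
Just to illustrate what that refactoring would involve, the two passes could
be done as a streaming scan over a pages XML dump. This is only a rough
sketch: it assumes the usual <page>/<ns>/<revision>/<text> dump layout, that
properties live in namespace 120 (as on wikidata.org), and that the entity
content in the text element is the JSON serialization. The scanDump() helper
and the file name are made up for the example, and the actual serialization
step is left out.

  <?php
  // Stream over a MediaWiki XML dump and invoke a callback for every
  // page in the requested namespace. Sketch only; error handling omitted.
  function scanDump( $file, $wantedNs, $onPage ) {
      $reader = new XMLReader();
      $reader->open( $file );

      while ( $reader->read() ) {
          if ( $reader->nodeType !== XMLReader::ELEMENT || $reader->name !== 'page' ) {
              continue;
          }
          $page = new SimpleXMLElement( $reader->readOuterXml() );
          if ( (int)$page->ns === $wantedNs ) {
              $onPage( (string)$page->title, (string)$page->revision->text );
          }
      }

      $reader->close();
  }

  $dump = 'pages-articles.xml'; // placeholder path

  // First pass: collect the property definitions; a dump of the
  // property namespace alone would be enough for this.
  $properties = array();
  scanDump( $dump, 120, function ( $title, $text ) use ( &$properties ) {
      $properties[$title] = json_decode( $text, true );
  } );

  // Second pass: walk the items (main namespace) and emit them,
  // using the property data collected in the first pass.
  scanDump( $dump, 0, function ( $title, $text ) use ( $properties ) {
      // ... serialize the entity to the dump output here ...
  } );

Reading the dump twice keeps only the property data in memory, at the cost of
parsing the XML a second time.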

I don't think we currently have a good way to test with a large data set
ourselves. Importing XML dumps does not really work with Wikibase
(this is an annoying issue, but not easy to fix).
