Hannah_Bast added a comment.

  Can you or anyone else explain why the data dump takes so long, Lucas? One 
would expect that dumping a (snapshot of a) dataset is much easier than 
building a complex data structure from it. Also, dumping and compression are 
easily parallelized, and the raw volume isn't that large (< 100 GB compressed).

TASK DETAIL
  https://phabricator.wikimedia.org/T290839
