hoo added a comment.

Given the length of time Wikidata weekly dumps take to run, do we still want to do this? What sort of cpu/memory requirements will it have, compared to gzip?

Memory use will probably be about the same (gzip uses tiny compression windows, at least by today's standards). CPU use will probably go up quite a bit (I would expect this to raise the total CPU time needed for the job by at least 10%).
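For context, gzip (DEFLATE) is limited to a 32 KiB match window, which is why modern compressors typically buy a better ratio at the cost of more CPU and memory. A quick sketch comparing Python's stdlib compressors on a toy payload (the sample JSON line below is made up for illustration, not real dump data):

```python
import bz2
import gzip
import lzma
import time

# Hypothetical stand-in for a chunk of a JSON entity dump;
# real Wikidata dumps are far larger and less uniform.
data = b'{"id": "Q42", "labels": {"en": "Douglas Adams"}}\n' * 10000

for name, compress in [("gzip", gzip.compress),
                       ("bz2", bz2.compress),
                       ("lzma", lzma.compress)]:
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    # Report size and wall-clock time for each compressor.
    print(f"{name}: {len(out)} bytes in {elapsed:.4f}s")
```

On repetitive input like this, all three compress well; the interesting numbers for the dump job would come from running the same comparison on a real dump slice.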

Do we still want this? I'm not sure… but looking into saving space/bandwidth for the gzip dumps sounds sensible to me (though I can't say whether we have the resources for this at hand).


TASK DETAIL
https://phabricator.wikimedia.org/T151876

To: hoo
Cc: ArielGlenn, Aklapper, hoo, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, Wikidata-bugs, aude, Svick, Mbch331, jeremyb