Mitar added a comment.

  Thank you for redirecting me to this issue. As I mentioned in T278204 
<https://phabricator.wikimedia.org/T278204>, my main motivation is in fact not 
parallel downloading but parallel processing. Just decompressing that large 
file takes half a day on my machine; if I could instead run 12 machines on 
12 splits, for example, that decompression (or some other processing) would 
take about an hour.
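  To make the idea concrete, here is a minimal sketch of the per-split 
parallelism described above, on a single machine with a process pool. The 
split filenames, the `.bz2` suffix, and the helper names are assumptions for 
illustration, not part of any existing dump tooling; the multi-machine case 
would instead shard the list of splits across hosts.

```python
import bz2
import multiprocessing
import os
import tempfile

def decompress_split(path):
    # Hypothetical helper: decompress one .bz2 split to a sibling
    # file with the suffix stripped, streaming in 1 MiB chunks.
    out_path = path[: -len(".bz2")]
    with bz2.open(path, "rb") as src, open(out_path, "wb") as dst:
        for chunk in iter(lambda: src.read(1 << 20), b""):
            dst.write(chunk)
    return out_path

def decompress_all(paths, workers=12):
    # One worker per split, up to `workers` processes at a time.
    with multiprocessing.Pool(workers) as pool:
        return pool.map(decompress_split, paths)

if __name__ == "__main__":
    # Demo: create three tiny splits, then decompress them in parallel.
    tmp = tempfile.mkdtemp()
    paths = []
    for i in range(3):
        p = os.path.join(tmp, "split-%d.json.bz2" % i)
        with bz2.open(p, "wb") as f:
            f.write(("record %d\n" % i).encode())
        paths.append(p)
    for out in decompress_all(paths, workers=3):
        print(open(out).read().strip())
```

With N splits of roughly equal size, wall-clock time for the decompression 
step drops by about a factor of N, which is the half-day-to-one-hour ratio 
mentioned above for 12 splits.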

TASK DETAIL
  https://phabricator.wikimedia.org/T115223
