Hannah_Bast added a comment.

  > Wikibase doesn’t store data in RDF, so dumping the data set means parsing 
the native representation (JSON) and writing it out again as RDF, including 
some metadata for each page.
  
  That is what I expected. But converting less than 100 GB of data from one 
format to another should not take that long. Which software are you using for 
the conversion? For example, Apache Jena has all kinds of tools to convert to 
and from the various RDF formats. It works, but it is incredibly slow (ten 
times slower than more efficient converters).
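  Just to illustrate the kind of conversion I have in mind, here is a minimal 
sketch using Jena's RDFDataMgr (the file names are placeholders, and a 
streaming approach would be needed for dumps of this size rather than an 
in-memory model):

```lang=java
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFDataMgr;

import java.io.FileOutputStream;
import java.io.OutputStream;

public class ConvertRdf {
    public static void main(String[] args) throws Exception {
        // Read an N-Triples file into an in-memory model
        // ("input.nt" and "output.ttl" are placeholder names).
        Model model = ModelFactory.createDefaultModel();
        RDFDataMgr.read(model, "input.nt");

        // Write the same data back out as Turtle.
        try (OutputStream out = new FileOutputStream("output.ttl")) {
            RDFDataMgr.write(out, model, Lang.TURTLE);
        }
    }
}
```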

TASK DETAIL
  https://phabricator.wikimedia.org/T290839
