Hey,

I got some data to share. Walking through the dump for the first 1000
entities (~19 MB) took 0.008 seconds per item, where in each step the
following things were done:

* read line from file
* json_decode the line
* use the EntityDeserializer to turn the array into DataModel objects
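
Roughly, the loop looks like this (the dump path, the deserializer wiring
and the 1000-entity cutoff are placeholders rather than the exact script):

<?php

use DataValues\Deserializers\DataValueDeserializer;
use Wikibase\DataModel\DeserializerFactory;
use Wikibase\DataModel\Entity\BasicEntityIdParser;

// Build an EntityDeserializer via the DataModel Serialization component.
// The list of data value classes here is just an example.
$deserializerFactory = new DeserializerFactory(
	new DataValueDeserializer( array(
		'string' => 'DataValues\StringValue',
		// ... further data value types as needed
	) ),
	new BasicEntityIdParser()
);
$entityDeserializer = $deserializerFactory->newEntityDeserializer();

$handle = fopen( 'dump.json', 'r' );
$start = microtime( true );
$count = 0;

while ( $count < 1000 && ( $line = fgets( $handle ) ) !== false ) {
	// The JSON dump is one entity per line, wrapped in [ ... ] with
	// trailing commas, so strip those before decoding.
	$line = rtrim( trim( $line ), ',' );
	if ( $line === '[' || $line === ']' || $line === '' ) {
		continue;
	}

	// Step 1 + 2: read line from file and json_decode it into an array.
	$data = json_decode( $line, true );

	// Step 3: turn the array into DataModel objects.
	$entity = $entityDeserializer->deserialize( $data );

	$count++;
}

fclose( $handle );
printf( "%.4f seconds per entity\n", ( microtime( true ) - $start ) / $count );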

Given that these entities are on average a lot bigger than the typical one
found in Wikidata, it looks like the average deserialization time is a few
milliseconds. So now I really wonder why people are blaming DataModel 1.0.
Everything seems to indicate most time is spent in Wikibase.git and
MediaWiki.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3
