Andrawaag added a comment.

  I wasn't looking for guarantees about the hash values. They have value as a 
sanity check in a [[ https://github.com/Wikidata/triplify-json | reverse 
engineering project ]] we are doing to reproduce the Wikidata/Wikibase RDF 
outside Wikibase itself. We need this to be able to apply EntitySchemas 
pre-ingestion; currently, EntitySchemas can only be applied after data 
ingestion. The script, as it currently works, builds the RDF from the JSON. 
That JSON object is enriched, and the idea is to then verify that the new JSON 
object still fits the EntitySchema before it is submitted to the Wikidata 
API.
  
  In building that RDF script, the hash values play a role in verifying that 
the (reverse-engineered) script does indeed produce exactly the same RDF as 
Wikidata produces natively. For most snaks, the hash values are given in the 
JSON produced by the Wikidata API.
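  To illustrate the point above: a minimal sketch of collecting the snak 
hashes that the Wikidata API already exposes in an entity's JSON. The fragment 
below only mimics the shape of a `wbgetentities` response; the property and 
hash values are made-up placeholders, not real Wikidata data.

```python
# Hypothetical fragment shaped like the "claims" section of a
# wbgetentities response; the hash string is a placeholder.
entity_json = {
    "claims": {
        "P31": [
            {
                "mainsnak": {
                    "snaktype": "value",
                    "property": "P31",
                    "hash": "deadbeef0123",  # placeholder hash
                    "datavalue": {
                        "value": {"id": "Q5"},
                        "type": "wikibase-entityid",
                    },
                },
                "type": "statement",
                "rank": "normal",
            }
        ]
    }
}

def snak_hashes(entity):
    """Yield (property, hash) pairs for every mainsnak that carries a hash."""
    for prop, statements in entity.get("claims", {}).items():
        for statement in statements:
            snak = statement.get("mainsnak", {})
            if "hash" in snak:
                yield prop, snak["hash"]

print(list(snak_hashes(entity_json)))
```

  Hashes gathered this way can be compared against the value-node hashes in 
the natively produced RDF; the gap discussed below concerns the hashes that 
never appear in this JSON at all.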
  
  This is not the case for those values in the RDF that are not present in the 
JSON export, specifically the normalized values for time and globe 
coordinates. That is why I am interested in the algorithm used internally to 
produce those hash values.

TASK DETAIL
  https://phabricator.wikimedia.org/T283997

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Addshore, Andrawaag
Cc: Lucas_Werkmeister_WMDE, Aklapper, Andrawaag, Invadibot, maantietaja, 
Akuckartz, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, 
_jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Addshore, Mbch331
_______________________________________________
Wikidata-bugs mailing list -- [email protected]
To unsubscribe send an email to [email protected]
