thiemowmde added a comment.
Are the snak hashes calculated at dump time or is this just another static field to be dumped?
Hashes are not meant to be stored in the database, but calculated every time they are needed.
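As a rough illustration of that on-demand model (a sketch only; Wikibase's actual snak serialization and hashing code differ), the hash can be derived from a snak's serialized form whenever it is needed, so nothing extra has to live in the database:

```python
import hashlib

def snak_hash(snak_repr: str) -> str:
    """Illustrative only: compute a snak hash on demand from a
    serialized representation, rather than storing it.
    (Hypothetical helper, not the actual Wikibase implementation.)"""
    return hashlib.sha1(snak_repr.encode("utf-8")).hexdigest()

# Example with a made-up serialized snak:
h = snak_hash('{"property":"P31","value":"Q5"}')
# h is a 40-character hexadecimal SHA1 digest
```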
What other fields are under consideration?
I assume this refers to secondary values, e.g. normalized quantity values (inches normalized to meters), full URIs for external identifiers, and the like. These should not be included in a minimal dump, but might be included in an expanded dump.
How much longer would it take to do these runs?
Runtime is not much of a problem, as far as I'm aware.
How much bigger would they be for downloaders?
That's a good question. I assume it might be somewhere between 1% and 10%. The hashes we are talking about here are mostly SHA1 hashes, in 40-character hexadecimal form. These do not compress well.
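A quick sketch of why hex digests compress poorly (illustrative numbers, not a measurement of actual dump files): each hexadecimal character carries only about 4 bits of entropy in an 8-bit byte, so a general-purpose compressor cannot do much better than roughly halving the encoded size, far worse than what it achieves on the repetitive JSON around the hashes.

```python
import hashlib
import os
import zlib

# Build a long run of hex-encoded SHA1 hashes of random inputs,
# mimicking many snak hashes concatenated in a dump.
hashes = "".join(
    hashlib.sha1(os.urandom(16)).hexdigest() for _ in range(1000)
)

compressed = zlib.compress(hashes.encode("ascii"), level=9)
ratio = len(compressed) / len(hashes)
# ratio hovers around 0.5: the 4-bits-per-character entropy floor
```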
Cc: Lydia_Pintscher, aude, WMDE-leszek, thiemowmde, ArielGlenn, hoo, daniel, Addshore, Lucas_Werkmeister_WMDE, Aklapper, GoranSMilovanovic, Soteriaspace, JakeTheDeveloper, QZanden, Zoranzoki21, Izno, Wikidata-bugs, TheDJ, Mbch331
