Quoted from Rory O'Farrell:
"So I see two problems -
one) the need to preserve the original file, which ought be easy as it is a
change in the logic of the Save process,
two) the need to investigate what code shortcoming leads to files of hashes,
which may involve detailed analysis of low level code (my thoughts are that it
may be caused by unmasked interrupts, but I haven't coded at that level for 30
years, so am out of my experience)."
I don't know if this hunch of mine makes any sense, but I couldn't help
thinking of OOP in Turbo Pascal. I worked with file objects there that
had a read/write buffer. All of the I/O would be inside try ... except
... finally blocks, and in the finally part any remaining data would be
written to the file, which would then be closed and the object
destroyed. The point is that whenever a new object was initialised with
its constructor, all of its fields were zeroed. If OOo does something
similar, I can imagine that when files are written to disk from such an
object and the power fails, the zeroed buffer might sometimes be
written to the file instead of the actual file data.
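To make the hunch a bit more concrete, here is a minimal C++ sketch of the
kind of object I have in mind. It only illustrates the Turbo Pascal pattern
I described; the class name and everything in it are invented for the
example, and it is not meant to reflect how OOo actually saves files.

#include <cstdio>
#include <cstddef>
#include <cstring>

// Hypothetical buffered writer, loosely modelled on the Turbo Pascal
// file objects described above.  Not OOo code; all names are invented.
class BufferedWriter {
public:
    explicit BufferedWriter(const char* path)
        : file(std::fopen(path, "wb")), used(0)
    {
        // The constructor zeroes the buffer, just as the Pascal
        // constructor zeroed every field of the object.
        std::memset(buffer, 0, sizeof(buffer));
    }

    void write(const char* data, std::size_t len) {
        // Simplified: assumes len fits in the buffer after a flush.
        if (used + len > sizeof(buffer)) flush();
        std::memcpy(buffer + used, data, len);
        used += len;
    }

    void flush() {
        if (file && used > 0) {
            std::fwrite(buffer, 1, used, file);
            used = 0;
        }
    }

    ~BufferedWriter() {
        // The "finally" part: push out whatever is left and close.
        // If power fails after fopen() has already truncated the old
        // file but before this flush completes, the original document
        // is gone and only zeros (or nothing) ever reach the disk.
        flush();
        if (file) std::fclose(file);
    }

private:
    std::FILE* file;
    char buffer[4096];
    std::size_t used;
};

int main() {
    BufferedWriter w("example.odt");   // overwrites the original in place
    w.write("document contents", 17);  // data sits only in the zeroed buffer
    // <-- a power failure here leaves a truncated or garbage file on disk
    return 0;                          // destructor flushes and closes
}

The weak point is that the original file is overwritten before the buffer
has been flushed, which is exactly why Rory's first point about preserving
the original during Save matters.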
Just my two cents.
Peter aka floris v