Someone else just discovered a mess I'd made in our data: for certain fields in compound data types/structs I was using double (64-bit IEEE floating point) for in-memory storage, but on disk those fields were declared as H5T_NATIVE_FLOAT (i.e. 32-bit IEEE).

To be more specific, I had a C++ double member that was being mapped to the HDF5 type H5T_NATIVE_FLOAT rather than H5T_NATIVE_DOUBLE. The end result hasn't been pretty: the files are usable only by code that retains the incorrect mapping.

Since repairing the type mapping to double -> H5T_NATIVE_DOUBLE, the previously written data files come back with invalid values for those fields. I had been hoping HDF5's internal type conversion would resolve the translation, but it hasn't.
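
In case it clarifies the question, here's roughly what the broken and the fixed mappings look like. The struct and member names are made up for illustration; my real type has more fields. My understanding of the failure is that with the bad mapping the library read 4 of the double's 8 bytes as a float at write time, so the stored bits were never valid float values in the first place, which would explain why type conversion can't recover them:

#include <hdf5.h>

typedef struct {
    int    id;
    double value;   /* 64-bit in memory */
} Record;

/* What I had (broken): the double member declared as H5T_NATIVE_FLOAT.
   HOFFSET still points at the real double, so H5Dwrite interprets the
   first 4 of its 8 bytes as a float rather than down-converting. */
hid_t make_bad_memtype(void)
{
    hid_t t = H5Tcreate(H5T_COMPOUND, sizeof(Record));
    H5Tinsert(t, "id",    HOFFSET(Record, id),    H5T_NATIVE_INT);
    H5Tinsert(t, "value", HOFFSET(Record, value), H5T_NATIVE_FLOAT);
    return t;
}

/* The fix: declare the member as what the buffer actually holds.
   HDF5 then down-converts double -> float correctly, if the file
   type really is meant to stay 32-bit. */
hid_t make_fixed_memtype(void)
{
    hid_t t = H5Tcreate(H5T_COMPOUND, sizeof(Record));
    H5Tinsert(t, "id",    HOFFSET(Record, id),    H5T_NATIVE_INT);
    H5Tinsert(t, "value", HOFFSET(Record, value), H5T_NATIVE_DOUBLE);
    return t;
}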

I'm assuming at this point that I'm going to have to rewrite the old files that contain the bad mapping. Does anyone have suggestions on how best to accomplish this? The original files contain:

1 dataset with the dodgy type mappings
1 dataset consisting of references to the above (essentially the same data, but indexed differently)
several dimension scales for the above two datasets
a slew of datatypes

What would have to change are the datatypes and the one dataset with the dodgy type mappings. Is there a quick and easy way to create a new file with the corrected types (I can back-fill the correct values for the broken fields later), or do I just need to rewrite the whole thing from scratch?
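
For what it's worth, the approach I've been considering (untested, and every object name below is a placeholder for my real ones): copy the unaffected objects with H5Ocopy, recreate just the broken dataset under the corrected compound type, and carry the good fields across with a partial compound read/write so the garbage floats never round-trip:

#include <hdf5.h>

int migrate(const char *old_path, const char *new_path, hid_t fixed_type)
{
    hid_t in  = H5Fopen(old_path, H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t out = H5Fcreate(new_path, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    if (in < 0 || out < 0) return -1;

    /* The dimension scales are unaffected, so copy them verbatim.
       Their REFERENCE_LIST attributes will still point at the old
       dataset, though, so I'd re-attach them (H5DSattach_scale from
       the high-level library) after the rewrite. */
    H5Ocopy(in, "scale0", out, "scale0", H5P_DEFAULT, H5P_DEFAULT);

    /* Recreate the broken dataset under the corrected compound type. */
    hid_t src   = H5Dopen2(in,  "records", H5P_DEFAULT);
    hid_t space = H5Dget_space(src);
    hid_t dst   = H5Dcreate2(out, "records", fixed_type, space,
                             H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* Carry over only the members that were written correctly: a memory
       compound containing just those fields is a legal field subset for
       both H5Dread and H5Dwrite. The broken fields stay unwritten and
       get back-filled later. */
    /* H5Dread (src, good_fields_memtype, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);
       H5Dwrite(dst, good_fields_memtype, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf); */

    /* The reference dataset points at "records", so I assume it has to
       be regenerated against the new copy rather than H5Ocopy'd. */

    H5Dclose(dst); H5Sclose(space); H5Dclose(src);
    H5Fclose(out); H5Fclose(in);
    return 0;
}

If h5repack or some other existing tool can do this more directly, I'd love to know.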

