Hi,

A user recently contributed a patch adding support for the scale-offset filter in h5py, and we are seeing some odd behavior which seems to be related to data caching. When the filter is set up for lossy encoding (e.g. storing 32-bit ints with 1 bit of precision) and a small dataset is written, subsequent reads return the original, uncompressed data. Closing and reopening the file, or writing larger datasets, produces the expected lossily-compressed data.
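
For reference, here is a minimal script along the lines of what we're doing (the scaleoffset= keyword comes from the contributed patch, so the exact spelling and filename here are just for illustration):

    import numpy as np
    import h5py

    # Small int32 dataset, scale-offset filter keeping only 1 bit of precision
    f = h5py.File("scaleoffset_test.h5", "w")
    data = np.arange(10, dtype=np.int32)
    dset = f.create_dataset("x", data=data, chunks=(10,), scaleoffset=1)

    # Reading back right away returns the original values, apparently served
    # from the chunk cache before the filter pipeline has been applied
    print(dset[...])
    f.close()

    # After reopening the file, the lossily-encoded (truncated) values come back
    f = h5py.File("scaleoffset_test.h5", "r")
    print(f["x"][...])
    f.close()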
Is this expected behavior? Is there any way to get HDF5 not to cache chunks when a lossy filter is used, or, preferably, to cache chunks only after the transformation has been applied?

Thanks,
Andrew Collette
