I am a fan of the scale-offset filter followed by the gzip filter to
really reduce the size of big 3D datasets of weather model data. I am
using this compression strategy with HDF5 in massively parallel
simulations, writing out one HDF5 file per MPI process.

I recently discovered, when rendering data that spans multiple files,
that there is a boundary issue as you hop from one dataset to the
next: a slight discontinuity in the uncompressed floating-point values
as you go from one file to the next. I imagine this has to do with the
internal parameters chosen by the filter algorithm, which must look at
the maximum and minimum values in the dataset being operated upon;
these will vary from file to file (i.e., from MPI process to MPI
process).
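To illustrate what I mean, here is a toy sketch of a generic linear
scale/offset quantizer (this is NOT the actual HDF5 scale-offset
algorithm; the bit width and the temperature-like values are made up
for illustration). Because each file computes its own min/max, the
same physical value on a shared boundary lands on two different
reconstruction grids:

```python
# Toy linear scale/offset quantizer (not the real HDF5 filter) showing
# why per-file min/max parameters cause boundary discontinuities.

def quantize(values, vmin, vmax, nbits=16):
    """Map floats onto a (2**nbits - 1)-step grid spanning [vmin, vmax]."""
    scale = (vmax - vmin) / (2**nbits - 1)
    return [round((v - vmin) / scale) for v in values], vmin, scale

def dequantize(codes, vmin, scale):
    return [vmin + c * scale for c in codes]

# The same physical value (say 273.15 K) sits on the shared boundary of
# two files, but each file determined its own min/max:
boundary = 273.15
codes_a, off_a, sc_a = quantize([boundary], vmin=250.0, vmax=310.0)
codes_b, off_b, sc_b = quantize([boundary], vmin=260.0, vmax=305.0)

recon_a = dequantize(codes_a, off_a, sc_a)[0]
recon_b = dequantize(codes_b, off_b, sc_b)[0]

# The two reconstructions of the identical input differ slightly,
# which shows up as a visible seam when rendering across files.
print(recon_a, recon_b, abs(recon_a - recon_b))
```

Both reconstructions are within the quantization error of the true
value, but they are not equal to each other, which is exactly the kind
of seam I am seeing.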

Is there some way to have the scale-offset filter use global
parameters, so that the discontinuities vanish? Before I used HDF5, I
used HDF4 and wrote my own scale/offset filter, which used the global
max and min values (determined with a collective MPI call), and this
worked fine. However, I like the transparency of the HDF5 filters and
would prefer not to write my own.
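Roughly, what my old HDF4-era filter did was this (a simplified sketch
with a toy quantizer, not the real code; the serial min/max here
stands in for the collective MPI_Allreduce with MPI_MIN/MPI_MAX):

```python
# Sketch of the global-parameter approach: compute ONE min/max across
# all ranks' data (in real code, via MPI_Allreduce), then quantize every
# file with the same scale and offset. Toy quantizer, made-up values.

def quantize(v, vmin, vmax, nbits=16):
    scale = (vmax - vmin) / (2**nbits - 1)
    return round((v - vmin) / scale)

def dequantize(code, vmin, vmax, nbits=16):
    scale = (vmax - vmin) / (2**nbits - 1)
    return vmin + code * scale

# Per-rank data blocks that share the value 273.15 on their boundary:
rank_data = [[250.0, 273.15, 281.0], [273.15, 290.0, 305.0]]

# Serial stand-in for MPI_Allreduce(..., MPI_MIN) / (..., MPI_MAX):
gmin = min(min(block) for block in rank_data)
gmax = max(max(block) for block in rank_data)

# Because every "file" now uses the same parameters, the shared
# boundary value reconstructs identically in both of them:
recons = [dequantize(quantize(273.15, gmin, gmax), gmin, gmax)
          for _ in rank_data]
print(recons)
```

With shared parameters the reconstruction grid is identical in every
file, so the seam disappears; the only cost is the one collective
reduction before writing.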

Any suggestions appreciated.

Thanks,

Leigh

-- 
Leigh Orf
Associate Professor of Atmospheric Science
Room 130G Engineering and Technology
Department of Geology and Meteorology
Central Michigan University
Mount Pleasant, MI 48859
(989)774-1923
Amateur radio callsign: KG4ULP
