Nathaniel,

>>
Even though we have constructed our methods to fit climate data, the
features we exploited for compression are very general and likely to
apply to other scientific data as well. This is why we are confident
that many of you could profit from these methods as well, and we would
be happy to share our results with the rest of the community.
<<

Can you supply a little more information about what the data is and what the 
compression is doing that's special?

Is it ECHAM data, for example? Are you compressing a single field array or a 
whole collection of arrays, individually or combined into a larger 
multidimensional dataset? Are they high-variance datasets of slowly varying 
fields?

Did you apply any other filters, such as the shuffle filter, before 
compressing? And is the compression lossless (I assume yes)?
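For what it's worth, the effect the shuffle filter exploits is easy to demonstrate outside HDF5. The sketch below is plain Python, not the HDF5 filter itself, and the `byte_shuffle`/`byte_unshuffle` helpers and the synthetic field are my own: regrouping the bytes of an array of doubles puts the nearly constant exponent and high-mantissa bytes into long runs, which a generic compressor like zlib then handles much better.

```python
import struct
import zlib

def byte_shuffle(data: bytes, itemsize: int) -> bytes:
    # Regroup bytes: all byte-0s of every element, then all byte-1s, ...
    return bytes(data[j] for i in range(itemsize)
                         for j in range(i, len(data), itemsize))

def byte_unshuffle(data: bytes, itemsize: int) -> bytes:
    # Exact inverse of byte_shuffle (the filter itself is lossless).
    n = len(data) // itemsize
    out = bytearray(len(data))
    k = 0
    for i in range(itemsize):
        for e in range(n):
            out[e * itemsize + i] = data[k]
            k += 1
    return bytes(out)

# A slowly varying field: neighbouring doubles share their exponent
# and high-order mantissa bytes almost exactly.
field = b"".join(struct.pack("<d", 280.0 + 0.001 * i) for i in range(1000))

plain = zlib.compress(field)
shuffled = zlib.compress(byte_shuffle(field, 8))
print(f"unshuffled: {len(plain)} bytes, shuffled: {len(shuffled)} bytes")

# Round-trips exactly, so losslessness is preserved.
assert byte_unshuffle(byte_shuffle(field, 8), 8) == field
```

In HDF5 itself this is just H5Pset_shuffle on the dataset creation property list before the deflate filter; the point of the sketch is only to show why it helps on smooth fields.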
If you switched to lossy compression by first packing the data into a fixed 
range (the scale-offset filter?), how would that affect the final compression 
ratio? Have you looked into this?
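On the lossy question: scale-offset packing is just affine quantisation, and a pure-Python sketch makes the trade-off concrete (these helper names and the example values are my own, not HDF5's built-in filter). Packing doubles into 16-bit integers cuts the raw size by 4x before any compressor runs, at the cost of a maximum error of half a quantisation step.

```python
import struct

def pack_scale_offset(values, nbits=16):
    # Lossy: map [vmin, vmax] onto 2**nbits - 1 integer steps.
    vmin, vmax = min(values), max(values)
    scale = (vmax - vmin) / (2 ** nbits - 1) or 1.0  # guard constant fields
    return vmin, scale, [round((v - vmin) / scale) for v in values]

def unpack_scale_offset(vmin, scale, ints):
    return [vmin + i * scale for i in ints]

values = [280.0 + 0.001 * i for i in range(1000)]
vmin, scale, ints = pack_scale_offset(values)

packed = b"".join(struct.pack("<H", i) for i in ints)   # 2 bytes/value vs 8
restored = unpack_scale_offset(vmin, scale, ints)

# Quantisation error is bounded by half a step.
assert all(abs(a - b) <= scale / 2 + 1e-12
           for a, b in zip(values, restored))
print(f"raw size {8 * len(values)} -> {len(packed)} bytes before compression")
```

The resulting integers also tend to compress further with deflate, since consecutive values differ by small amounts; whether the precision loss is acceptable is of course a per-field decision.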

Are there more details of this kind available? If so, may I see them, please? 
We have users with excessive data storage needs, and any progress towards 
reducing them is welcome.

thanks

JB.
PS. Others have suggested sites to host code; I can recommend our HPC Forge 
site, where our VFD is hosted:
https://hpcforge.org/projects/h5fddsm/
If this is of interest, I'd be happy to set things up for you.
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
