On Tuesday 13 July 2010 17:06:03, John Knutson wrote:

> I'm trying to find the optimal compression for 2- and 3-dimensional
> matrices of recorded data. These data sets contain data that doesn't
> change much over time (and time is being used as the first axis). I
> thought that by using shuffle, I might get better compression, but
> instead the resulting files were larger than without shuffle.
In my experience, shuffle does generally help in reducing your compressed
data sizes, except when it does not ;-) I mean, experimentation is the
best way to check whether shuffle is going to help you or not.

> Is shuffle meant to work with compound types? Are there things I need
> to be considering in the organization of the axes of the data set in
> order to better encourage compression?

Yes, shuffle is designed to work with compound types too. And it works at
the chunk level, so depending on the shape of your chunks and how the data
changes along each dimension of that shape, it *could* have a measurable
effect indeed. Out of curiosity, what is the size of your compound type,
and what is your chunk size?

-- 
Francesc Alted
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
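[Editorial note: the "experiment with shuffle" advice above can be tried outside HDF5 entirely. The sketch below is a pure-Python reimplementation of the byte-shuffle transform for illustration only, not a call into the HDF5 library; `byte_shuffle` and the synthetic slowly-varying data are assumptions made for the example.]

```python
import struct
import zlib

def byte_shuffle(buf: bytes, itemsize: int) -> bytes:
    """Rearrange bytes so byte 0 of every item comes first, then byte 1
    of every item, and so on -- the transform HDF5's shuffle filter
    applies to a chunk before the compression filter sees it."""
    n = len(buf) // itemsize
    return bytes(buf[i * itemsize + b]
                 for b in range(itemsize)
                 for i in range(n))

# Slowly-varying "recorded data": consecutive 4-byte little-endian ints
# whose high-order bytes almost never change from sample to sample.
raw = b"".join(struct.pack("<i", 1_000_000 + i) for i in range(4096))

plain = len(zlib.compress(raw, 6))
shuffled = len(zlib.compress(byte_shuffle(raw, 4), 6))
print(f"plain: {plain} bytes, shuffled: {shuffled} bytes")
```

On data like this, shuffling collects the nearly-constant high-order bytes into long runs that DEFLATE compresses very well; on data that varies in every byte, the same transform can enlarge the output, which is why measuring both ways, as suggested above, is the only reliable test.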
