Our instrument control software uses HDF5 files to store neutron acquisition
data.
As the size of the "data" group grows, compression behaves randomly:
sometimes the dataset ends up compressed, sometimes not. Here are the dumps of
two files containing the same dataset but with different resulting compression:
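(For reference, the dumps below were produced with something along the lines of

   h5dump -p -H 000028.nxs
   h5dump -p -H 000029.nxs

where -p makes h5dump report the storage layout and filter information and -H
omits the data values.)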

Bad file (1.000:1, i.e. effectively uncompressed):

HDF5 "000028.nxs" {
GROUP "/" {
   ATTRIBUTE "HDF5_Version" {
      DATATYPE  H5T_STRING {
            STRSIZE 5;
            STRPAD H5T_STR_NULLTERM;
            CSET H5T_CSET_ASCII;
            CTYPE H5T_C_S1;
         }
      DATASPACE  SCALAR
   }
   GROUP "entry0" {
      ATTRIBUTE "NX_class" {
         DATATYPE  H5T_STRING {
               STRSIZE 7;
               STRPAD H5T_STR_NULLTERM;
               CSET H5T_CSET_ASCII;
               CTYPE H5T_C_S1;
            }
         DATASPACE  SCALAR
      }
      GROUP "data" {
         ATTRIBUTE "NX_class" {
            DATATYPE  H5T_STRING {
                  STRSIZE 6;
                  STRPAD H5T_STR_NULLTERM;
                  CSET H5T_CSET_ASCII;
                  CTYPE H5T_C_S1;
               }
            DATASPACE  SCALAR
         }
         DATASET "data" {
            DATATYPE  H5T_STD_I32LE
            DATASPACE  SIMPLE { ( 384, 256, 1024 ) / ( 384, 256, 1024 ) }
            STORAGE_LAYOUT {
               CHUNKED ( 384, 256, 1024 )
               SIZE 402653184 (1.000:1 COMPRESSION)
             }
            FILTERS {
               COMPRESSION DEFLATE { LEVEL 6 }
            }
            FILLVALUE {
               FILL_TIME H5D_FILL_TIME_IFSET
               VALUE  0            
            }
            ALLOCATION_TIME {
               H5D_ALLOC_TIME_INCR
            }
            ATTRIBUTE "signal" {
               DATATYPE  H5T_STD_I32LE
               DATASPACE  SCALAR
            }
         }
      }

Correct file (2.892:1 compression):

HDF5 "000029.nxs" {
GROUP "/" {
   ATTRIBUTE "HDF5_Version" {
      DATATYPE  H5T_STRING {
            STRSIZE 5;
            STRPAD H5T_STR_NULLTERM;
            CSET H5T_CSET_ASCII;
            CTYPE H5T_C_S1;
         }
      DATASPACE  SCALAR
   }
   GROUP "entry0" {
      ATTRIBUTE "NX_class" {
         DATATYPE  H5T_STRING {
               STRSIZE 7;
               STRPAD H5T_STR_NULLTERM;
               CSET H5T_CSET_ASCII;
               CTYPE H5T_C_S1;
            }
         DATASPACE  SCALAR
      }
      GROUP "data" {
         ATTRIBUTE "NX_class" {
            DATATYPE  H5T_STRING {
                  STRSIZE 6;
                  STRPAD H5T_STR_NULLTERM;
                  CSET H5T_CSET_ASCII;
                  CTYPE H5T_C_S1;
               }
            DATASPACE  SCALAR
         }
         DATASET "data" {
            DATATYPE  H5T_STD_I32LE
            DATASPACE  SIMPLE { ( 384, 256, 1024 ) / ( 384, 256, 1024 ) }
            STORAGE_LAYOUT {
               CHUNKED ( 384, 256, 1024 )
               SIZE 139221680 (2.892:1 COMPRESSION)
             }
            FILTERS {
               COMPRESSION DEFLATE { LEVEL 6 }
            }
            FILLVALUE {
               FILL_TIME H5D_FILL_TIME_IFSET
               VALUE  0            
            }
            ALLOCATION_TIME {
               H5D_ALLOC_TIME_INCR
            }
            ATTRIBUTE "signal" {
               DATATYPE  H5T_STD_I32LE
               DATASPACE  SCALAR
            }
         }
      }

Compression type: NX_COMP_LZW
HDF5 version 1.8.3, called through the NeXus library 4.3.0.
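For context, the dataset is created through the NeXus C API (NAPI). What follows
is only a minimal, self-contained sketch of that call sequence, assuming the
standard NXcompmakedata route (NX_COMP_LZW is mapped by the library to the HDF5
DEFLATE filter); names and dimensions are taken from the dumps above, and our
real acquisition code is more involved:

   /* Sketch only, not our production code: write entry0/data/data
    * (384 x 256 x 1024, 32-bit int) with NX_COMP_LZW compression
    * through the NeXus C API. */
   #include <stdlib.h>
   #include "napi.h"

   int main(void)
   {
       NXhandle fid;
       int dims[3]  = {384, 256, 1024};   /* shape as shown by h5dump        */
       int chunk[3] = {384, 256, 1024};   /* one chunk covering the dataset, */
                                          /* as reported in STORAGE_LAYOUT   */
       int one = 1;
       int *data = calloc((size_t)384 * 256 * 1024, sizeof(int));
       if (data == NULL) return 1;

       if (NXopen("000028.nxs", NXACC_CREATE5, &fid) != NX_OK) return 1;

       NXmakegroup(fid, "entry0", "NXentry");
       NXopengroup(fid, "entry0", "NXentry");
       NXmakegroup(fid, "data", "NXdata");
       NXopengroup(fid, "data", "NXdata");

       /* Create the compressed dataset; NX_COMP_LZW -> HDF5 DEFLATE filter */
       NXcompmakedata(fid, "data", NX_INT32, 3, dims, NX_COMP_LZW, chunk);
       NXopendata(fid, "data");
       NXputattr(fid, "signal", &one, 1, NX_INT32);  /* "signal" attribute   */
       NXputdata(fid, data);
       NXclosedata(fid);

       NXclosegroup(fid);
       NXclosegroup(fid);
       NXclose(&fid);
       free(data);
       return 0;
   }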

Is there an explanation for such random behaviour? Are there any solutions or
workarounds?



