I have an HDF5 data file with a single dataset in it called "phi", which has 
dimensions 350x350x450 and contains 32-bit float values. I am trying to use 
h5dump to dump the data as a binary file. 

507:[mjackson@Mine:MURI_Dendrite_Dataset]$ h5dump  -d /H -o H.bin 
dendrite_0008.h5 
HDF5 "dendrite_0008.h5" {
DATASET "/H" {
   DATATYPE  H5T_IEEE_F32LE
   DATASPACE  SIMPLE { ( 350, 350, 450 ) / ( 350, 350, 450 ) }
   DATA {
   }
}
}

The resulting file size is 625,492,563 bytes, which is way too large. It should 
be 220,500,000 bytes. I am sure this is a "user error", but I cannot see what I 
am doing wrong. Any help would be great. The original file size is about 220 MB 
(which seems correct).
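For reference, here is a quick sketch of the arithmetic behind the expected size, using the dimensions and element type from the h5dump header above:

```python
# Sanity check: raw binary size of a 350 x 350 x 450 dataset
# of 32-bit (4-byte) IEEE floats, per the H5T_IEEE_F32LE type.
dims = (350, 350, 450)
bytes_per_value = 4

expected = dims[0] * dims[1] * dims[2] * bytes_per_value
print(expected)  # 220500000
```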

Thanks
Mike Jackson


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org