I have been using h5dump to help provide diagnostics on our data, and
have noticed behavior that I neither understand nor expect.
If I use a command like "h5dump -o somefile --dataset=somedataset -s
5630,0,0,0 -c 10,2,32,1 somefile.h5", the output is what I expect: a
dump of all elements from 5630,0,0,0 through 5639,1,31,0.
However, if I change that command from -c to -k (so the same numbers
specify a block size instead of a count), I end up with a very strange
set of blocks in no apparent pattern. First it dumps 5630,0,0-19,0,
then 5630,0,1-20,0, and so on. I suppose there is a pattern here, but
it's a really odd one, and h5dump never goes beyond 5630 in the first
dimension.
I would have expected to get the exact same set of elements. Why am I
seeing what I'm seeing?
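For reference, here is my understanding of how the h5dump flags map onto
HDF5 hyperslab selection (-s start, -S stride, -c count, -k block, as in
H5Sselect_hyperslab): "count" is the number of blocks per dimension and
"block" is the size of each block, so -c 10 with the default block of 1
and -k 10 with a count of 1 should pick the same elements. A minimal
Python sketch of that semantics along one dimension (the function name
is my own, not part of any HDF5 API):

```python
def hyperslab_indices(start, stride, count, block):
    """Indices selected by an HDF5-style hyperslab along one dimension:
    `count` blocks of `block` consecutive elements, with successive
    blocks starting `stride` apart (my reading of H5Sselect_hyperslab)."""
    idx = []
    for c in range(count):
        base = start + c * stride
        idx.extend(range(base, base + block))
    return idx

# -c 10 (count 10, default block 1): ten single-element blocks
print(hyperslab_indices(5630, 1, 10, 1))  # 5630 .. 5639
# -k 10 (block 10) with count 1: one ten-element block, same elements
print(hyperslab_indices(5630, 1, 1, 10))  # 5630 .. 5639
```

If that model is right, swapping -c for -k while leaving the count
unspecified must be changing what count defaults to in each dimension,
which could explain why the selections differ.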
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org