"h5dump -p" gives you a per dataset storage_layout information which contains the SIZE and OFFSET of the dataset. I always use it with "-H" command so that it just prints the header of the HDF5 file. For example:

h5dump -pH sample_dataset.h5

Hope this helps,
Babak
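
If you prefer a scriptable answer to Konrad's question, here is a small sketch using the third-party h5py library (assuming it is installed; the file and dataset names below are made up for illustration). It walks every dataset in a file and prints the number of bytes actually allocated for it on disk, which is close to the "how much smaller would the file be without dataset X" notion:

```python
# Sketch, assuming h5py is installed; file and dataset names are hypothetical.
import h5py
import numpy as np

# Create a sample file with one small and one large dataset.
with h5py.File("sample_dataset.h5", "w") as f:
    f.create_dataset("small", data=np.arange(10))
    f.create_dataset("big", data=np.zeros((1000, 1000)))

def print_sizes(name, obj):
    # visititems() calls this for every object; report only datasets.
    if isinstance(obj, h5py.Dataset):
        print(name, obj.id.get_storage_size(), "bytes")

# Walk the file and report the allocated storage size of every dataset.
with h5py.File("sample_dataset.h5", "r") as f:
    f.visititems(print_sizes)
```

Sorting that output by size would immediately surface the handful of big datasets among hundreds of small ones.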

On 11/05/2013 08:31 AM, Konrad Hinsen wrote:
Hi everyone,

I just spent some time looking for a command-line tool that shows the
size occupied by each dataset in a file. I didn't find anything. The
most promising candidates were h5stat, h5ls, and h5dump, but it seems
that none of them can provide the information I am looking for.

Is there perhaps a third-party tool for that purpose?

I realize that "size" can be defined in lots of ways, but I don't
really care about the details. I have lots of files that each contain
hundreds of datasets, of which most are small but a few are very big.
I am looking for a simple way to identify the big ones. My ideal
definition of size is "how much smaller would the file be if dataset X
were not in there".

Konrad.


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org