Hi,

I have a 4 GB binary dump of data that I'd like to store as an HDF5 dataset
(using command-line tools if possible).  The dataset will have the
dimensions 31486448 x 128.  I believe this is too big to import as a
dataset in one go.

Running h5import gives the following error:
Unable to allocate dynamic memory.
Error in allocating unsigned integer data storage.
Error in reading the input file: my_data
Program aborted.

So I split the binary dump into four files, each of which can be imported.
I'd still like to end up with a single 31486448 x 128 dataset, but I'm not
sure that's possible.

Any idea how I could combine these four binary dumps into one dataset?
Maybe create a single dataset and append each smaller one to it...?
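For what it's worth, here is a minimal sketch of the append idea in Python with h5py rather than the command-line tools. It assumes the values are unsigned 32-bit integers (match the dtype to whatever your h5import configuration used), and the file and dataset names ("combined.h5", "my_data") are placeholders. It creates one resizable, chunked dataset and writes each part file into it in turn, so no single allocation has to hold the full 4 GB:

```python
import numpy as np
import h5py

def combine(part_files, out_file, dset_name="my_data",
            cols=128, dtype=np.uint32):
    """Concatenate raw binary part files row-wise into one HDF5 dataset."""
    with h5py.File(out_file, "w") as f:
        # Start empty; maxshape=(None, cols) lets us grow along axis 0.
        dset = f.create_dataset(dset_name, shape=(0, cols),
                                maxshape=(None, cols),
                                dtype=dtype, chunks=(4096, cols))
        for path in part_files:
            # Each part must be a whole number of rows of `cols` values.
            part = np.fromfile(path, dtype=dtype).reshape(-1, cols)
            dset.resize(dset.shape[0] + part.shape[0], axis=0)
            # Write into the newly added rows at the end of the dataset.
            dset[-part.shape[0]:] = part
```

With the real data this would still read each ~1 GB part fully into memory; if that's too much, the same loop works with smaller slices read via np.fromfile's count argument. Alternatively, since the dataset is written slab by slab anyway, the original unsplit 4 GB dump could be streamed the same way without splitting it at all.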

Thanks,

Ryan
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
