My data is split across hundreds of separate files, named file_001, file_002,
up to file_00N. Each file contains an array of 128-dimensional vectors, but
these were generated with Matlab's dlmwrite function, which writes files as
text, not in binary. What I need is to put all of the vectors from all of
these files into one single HDF5 file, so that in the end this HDF5 file
holds millions of 128-dimensional vectors in a single dataset.
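To be concrete, each file can be read back into memory like this (a minimal
sketch in Python/NumPy; the file name and the comma delimiter are assumptions
on my part, since dlmwrite writes comma-delimited text by default):

    import numpy as np

    # Read one dlmwrite-produced text file; each row is expected to be
    # one 128-dimensional vector.
    vecs = np.loadtxt("file_001", delimiter=",")
    print(vecs.shape)  # expected: (n_vectors, 128)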

I have tried looking around for how to do this, and I found
http://www.hdfgroup.org/HDF5/doc/RM/Tools.html#Tools-Import
 
From the h5import documentation, I read that my data needs to be in binary
form first, and that I could use h5dump to do that. However, even after
re-reading the documents a few times I am still confused, especially about
the configuration file part.

So if I just want to achieve the above, how can I do that? Or do I need to
use some other tool?
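In case it helps to frame the question, here is a rough sketch of the kind of
result I am after, written with Python and h5py as a possible alternative to
h5import. This is only an assumption that such a tool would be acceptable;
the file names, the comma delimiter, and the dataset name "vectors" are all
illustrative:

    import glob
    import numpy as np
    import h5py

    files = sorted(glob.glob("file_*"))  # file_001, file_002, ...

    with h5py.File("vectors.h5", "w") as h5:
        # Resizable dataset: start empty and grow it one text file at a
        # time, so the whole collection never has to fit in memory at once.
        dset = h5.create_dataset("vectors", shape=(0, 128),
                                 maxshape=(None, 128), dtype="float64",
                                 chunks=True)
        for name in files:
            # reshape(-1, 128) also covers a file holding a single vector,
            # which loadtxt would otherwise return as a 1-D array.
            block = np.loadtxt(name, delimiter=",").reshape(-1, 128)
            n = dset.shape[0]
            dset.resize(n + block.shape[0], axis=0)
            dset[n:] = block

The maxshape=(None, 128) plus resize pattern is, as far as I understand, what
allows a single dataset to grow incrementally; a fixed-shape dataset would
require knowing the total vector count up front.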



