Dear All
Some months ago I used the h5py library to write a compressed h5 file (with its 
native gzip filter) and Scilab to read it; this worked fine and allowed me to 
handle several GB of data.
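For reference, this is roughly what the h5py side looked like (a minimal 
sketch; the file and dataset names are just placeholders):

    import numpy as np
    import h5py

    data = np.random.rand(1000, 1000)   # stand-in for the real arrays
    with h5py.File("example.h5", "w") as f:
        # gzip is HDF5's built-in deflate filter; level 4 is a typical
        # size/speed trade-off
        f.create_dataset("mydata", data=data,
                         compression="gzip", compression_opts=4)

Reading such a file back is transparent, since decompression is handled inside 
the HDF5 library itself.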
Now I would like to write/read large amounts of data mainly from Scilab 
(typically after several csvRead calls), but I'm wondering whether it is 
possible to optimize the h5 file size using the gzip capability.
HDFView confirms that data written with h5write are not compressed, and I do 
not see such a feature in the help documentation.
Thanks for any feedback
Paul
