It would be nice if you could run your program with a large array and get
some performance results.
Since you are not writing data to disk at all, I expect your
read/write speed could be as high
as 1 GB/s rather than a typical disk I/O speed, e.g. 100 MB/s. I wonder
whether the JVM will create a bottleneck.
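One rough way to test this is to write a large array into a purely in-memory HDF5 file and time it. This is a sketch in Python with h5py (not the Java API the thread uses) via the "core" driver with backing_store=False, so nothing touches disk; the array size and dataset name are illustrative, not from the original post.

```python
import time
import numpy as np
import h5py

# ~64 MB of float64 data (size chosen arbitrarily for the sketch)
data = np.random.rand(8_000_000)

# driver="core" + backing_store=False keeps the whole file in RAM,
# so the measured rate reflects memory/library overhead, not disk I/O.
with h5py.File("bench.h5", "w", driver="core", backing_store=False) as f:
    t0 = time.perf_counter()
    f.create_dataset("speeds", data=data)
    elapsed = time.perf_counter() - t0

mb = data.nbytes / 1e6
print(f"wrote {mb:.0f} MB in {elapsed:.4f} s -> {mb / elapsed:.0f} MB/s")
```

A Java equivalent would use the FAPL call H5Pset_fapl_core with a false backing-store flag before H5Fcreate.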
Thanks
--pc
Nigel Pickard wrote:
Here's some sample code to hopefully help anybody else who might be
struggling with this topic. Basically, a car object
with a string model description, an int year, and a double array of top
speed values is put into memory in HDF5 format and
subsequently retrieved.
Since the model and year member variables are small, I used attributes
to describe this data in the HDF5 file. And even
though the top speeds array has only 4 entries, I assume this data
might be large in practice, so I used a dataset for it.
Since I just wanted to get this up and running, I freely used
HDF5Constants.H5P_DEFAULT whenever I was in doubt, but I
appreciate it may not be the best choice, particularly since I do not
fully understand all the implications. If anyone sees anything
terribly wrong, please let me know, thanks.
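For readers without the Java bindings set up, here is a minimal sketch of the same layout in Python with h5py (a substitute for the Java HDF5 API in the original sample): model and year stored as attributes on a group, the top speeds as a dataset. The file, group, and attribute names are illustrative, not taken from the original code.

```python
import numpy as np
import h5py

top_speeds = np.array([155.0, 160.5, 148.2, 162.0])

# Write: small scalar fields as attributes, the array as a dataset.
with h5py.File("car.h5", "w") as f:
    car = f.create_group("car")
    car.attrs["model"] = "Mustang"   # string attribute
    car.attrs["year"] = 1969         # int attribute
    car.create_dataset("top_speeds", data=top_speeds)

# Read it back.
with h5py.File("car.h5", "r") as f:
    car = f["car"]
    model = car.attrs["model"]
    year = int(car.attrs["year"])
    speeds = car["top_speeds"][:]

print(model, year, speeds)
```

The split mirrors the reasoning in the post: attributes are meant for small metadata attached to an object, while datasets handle arrays that may grow large and need partial I/O or chunking.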
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org