Hi,
I'm reading a file of ~83MB that consists of 6294 datasets,
converting the single-precision data in the file to double precision in
memory, so the total memory occupancy should be about 145MB once all
data are loaded.
However, after reading a dataset with H5Dread() I leave the
dataset identifier open for further use, in case the same dataset
needs to be read again later. This leads to a memory overhead of
388MB, more than twice as large as the data actually held in
memory. When I call H5Dclose() on the dataset identifier right after
the H5Dread() call, this memory overhead does not appear.
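In case it helps, here is a minimal sketch of the pattern, reduced to a
single dataset (file name and dataset path are placeholders, and error
checking is omitted):

#include <stdlib.h>
#include "hdf5.h"

/* Read one dataset's single-precision file data as double precision. */
static double *read_as_double(hid_t dset)
{
    hid_t    space   = H5Dget_space(dset);
    hssize_t npoints = H5Sget_simple_extent_npoints(space);
    double  *buf     = malloc((size_t)npoints * sizeof(double));

    /* HDF5 converts float (file) -> double (memory) during the read */
    H5Dread(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

    H5Sclose(space);
    return buf;
}

int main(void)
{
    hid_t file = H5Fopen("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t dset = H5Dopen2(file, "/group/dataset0", H5P_DEFAULT);

    double *buf = read_as_double(dset);

    /* Keeping "dset" open here, for a possible later re-read, is the
       pattern that produces the ~388MB overhead across all 6294
       datasets; closing it right after the read, as done below,
       makes the overhead disappear. */
    H5Dclose(dset);

    free(buf);
    H5Fclose(file);
    return 0;
}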
Is there a way to free this evidently HDF5-internal memory that is
allocated when reading a dataset? I'd like to keep the dataset
identifier available for further use, but this memory overhead per
read is killing memory performance. Calling H5garbage_collect()
doesn't help. I'm using HDF5 version 1.8.4-snap17; has there been any
change in this memory-management behavior in more recent versions?
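For completeness, this is roughly how I invoked it, after the reads and
with all dataset identifiers still open (again just a sketch):

#include <stdio.h>
#include "hdf5.h"

/* attempt to release HDF5's internal free-list memory;
   had no visible effect on the overhead */
if (H5garbage_collect() < 0)
    fprintf(stderr, "H5garbage_collect() failed\n");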
Werner
--
___________________________________________________________________________
Dr. Werner Benger Visualization Research
Laboratory for Creative Arts and Technology (LCAT)
Center for Computation & Technology at Louisiana State University (CCT/LSU)
211 Johnston Hall, Baton Rouge, Louisiana 70803
Tel.: +1 225 578 4809 Fax.: +1 225 578-5362