I am trying to get a handle on how much memory HDF5 is using for caching,
and I have a couple of questions:

Do the cache limits apply globally (per process), per file, per dataset, or
in some other way?  Specifically, when computing total memory usage, should
I simply add the memory for the raw data chunk cache and the metadata
cache, or do I need to multiply one or both by the number of
files/datasets/something else?
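
For context, here is roughly how I have been reading the configured limits
back out, at both the file level and the dataset level. This is just a
minimal sketch; the file name "data.h5" and dataset name "/dset" are
placeholders, and I am assuming H5Pget_cache and H5Pget_chunk_cache are the
right calls to inspect these settings:

    #include <stdio.h>
    #include <hdf5.h>

    int main(void)
    {
        /* Placeholder file and dataset names. */
        hid_t file = H5Fopen("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset = H5Dopen2(file, "/dset", H5P_DEFAULT);

        int    mdc_nelmts;   /* ignored since 1.8; metadata cache is self-tuning */
        size_t rdcc_nslots, rdcc_nbytes;
        double rdcc_w0;

        /* File-level defaults for the raw data chunk cache, read from
         * the file access property list. */
        hid_t fapl = H5Fget_access_plist(file);
        H5Pget_cache(fapl, &mdc_nelmts, &rdcc_nslots, &rdcc_nbytes, &rdcc_w0);
        printf("file-level chunk cache default: %zu bytes, %zu slots\n",
               rdcc_nbytes, rdcc_nslots);

        /* Per-dataset chunk cache settings, read from the dataset
         * access property list. */
        hid_t dapl = H5Dget_access_plist(dset);
        H5Pget_chunk_cache(dapl, &rdcc_nslots, &rdcc_nbytes, &rdcc_w0);
        printf("per-dataset chunk cache:        %zu bytes, %zu slots\n",
               rdcc_nbytes, rdcc_nslots);

        H5Pclose(dapl);
        H5Pclose(fapl);
        H5Dclose(dset);
        H5Fclose(file);
        return 0;
    }

My understanding is that H5Pset_chunk_cache can override the file-level
default for an individual dataset, which is part of what makes me suspect
the chunk cache figure may need to be multiplied by the number of open
datasets.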

Is there a good way to measure actual cache memory usage, or am I limited
to using top to check process memory and estimating values from the cache
parameters?
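
The closest thing I have found so far is H5Fget_mdc_size, which reports the
current size of an open file's metadata cache; as far as I can tell there
is no corresponding call for the raw data chunk cache. A minimal sketch of
what I mean (again, the file name is a placeholder):

    #include <stdio.h>
    #include <hdf5.h>

    int main(void)
    {
        hid_t file = H5Fopen("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT);

        /* Query the current state of this file's metadata cache. */
        size_t max_size, min_clean_size, cur_size;
        int    cur_num_entries;
        H5Fget_mdc_size(file, &max_size, &min_clean_size,
                        &cur_size, &cur_num_entries);
        printf("metadata cache: %zu of %zu bytes in use, %d entries\n",
               cur_size, max_size, cur_num_entries);

        H5Fclose(file);
        return 0;
    }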

Thank you,
Ethan