Hi Ethan,

On Jul 19, 2010, at 8:41 PM, Ethan Dreyfuss wrote:

> I am trying to get a handle on how much memory is being used by HDF5 for 
> caching, and have a couple questions:
> 
> Do the cache limits apply globally (per process), per file, per dataset, or 
> in some other way?  Specifically when trying to compute the total memory 
> usage I should just add the memory for the raw data chunk cache and the 
> metadata cache or do I need to multiply one or both by the number of 
> files/datasets/other?

        The metadata cache is per file and the raw data chunk cache is per 
dataset.

> Is there any good way to measure actual cache memory usage or am I limited to 
> using top to check process memory usage and computing values based on cache 
> parameters?

        Hmm, you can query the metadata cache, but I don't think there's a 
query function for the chunk cache currently.  Also, you can manually garbage 
collect the internal HDF5 library memory allocations with H5garbage_collect(), 
but we don't have a way to query that usage right now either.  For now, 
valgrind or top is probably still the most reasonable approach...

        Quincey



_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org