On Tue, 2010-12-07 at 12:53 -0800, Werner Benger wrote:

> I've been facing similar issues, such as scanning through 500 GB HDF5
> files with 32 GB of available RAM. That works OK with HDF5, but I've
> implemented my own memory-management strategy to tell it which parts
> to keep in memory and which to unload (using random access to the
> datasets, of course). It turned out that the "remove the least-used
> object" strategy is not necessarily the best one (as the OS would
> follow with pages); some classification based on the similarity of
> the objects to be kept or discarded from memory seems much more
> efficient.


Out of curiosity, did you consider (or need) to use
posix_fadvise/posix_madvise to give the OS a hint that the application
is managing the memory/caching itself, so the OS should NOT attempt to?
(Another way of doing this, I guess, is direct I/O -- O_DIRECT --
though that is not available in very many places.)

Mark



-- 
Mark C. Miller, Lawrence Livermore National Laboratory
================!!LLNL BUSINESS ONLY!!================
[email protected]      urgent: [email protected]
T:8-6 (925)-423-5901    M/W/Th:7-12,2-7 (530)-753-8511


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
