On Tue, 07 Dec 2010 21:37:51 +0100, Philip Winston <[email protected]> wrote:
You can always munmap parts of the mapping as well, which returns the page backing to the OS. I've been facing similar issues, such as scanning through 500GB HDF5 files with 32GB of available RAM. That works ok with HDF5, but I've implemented my own memory management strategy to tell it which parts to keep in memory and which parts to unload (using random access to the datasets, of course). It turned out that the "remove the least-used object" strategy is not necessarily the best one (as the OS would follow suit with the pages); instead, some classification based on the similarity of objects that are kept or to be discarded from memory seems much more efficient.

       Werner

-- 
___________________________________________________________________________
Dr. Werner Benger                    Visualization Research
Laboratory for Creative Arts and Technology (LCAT)
Center for Computation & Technology at Louisiana State University (CCT/LSU)
211 Johnston Hall, Baton Rouge, Louisiana 70803
Tel.: +1 225 578 4809    Fax.: +1 225 578 5362
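For what it's worth, the baseline "remove the least-used object" strategy mentioned above can be sketched as a small LRU cache over dataset chunks. This is only an illustrative sketch, not Werner's actual implementation: the `ChunkCache` class and the `fake_load` stand-in loader are hypothetical names, and a real version would do random-access HDF5 reads in the loader.

```python
from collections import OrderedDict

class ChunkCache:
    """Keep at most `capacity` dataset chunks in memory, evicting the
    least-recently-used chunk first (the baseline strategy discussed above)."""

    def __init__(self, loader, capacity):
        self.loader = loader          # callable: chunk_id -> data
        self.capacity = capacity
        self._cache = OrderedDict()   # chunk_id -> data, oldest first

    def get(self, chunk_id):
        if chunk_id in self._cache:
            # Cache hit: mark the chunk as most recently used.
            self._cache.move_to_end(chunk_id)
            return self._cache[chunk_id]
        data = self.loader(chunk_id)
        self._cache[chunk_id] = data
        if len(self._cache) > self.capacity:
            # Evict the least-recently-used chunk.
            self._cache.popitem(last=False)
        return data

# Stand-in loader; in practice this would be a random-access HDF5 read.
loads = []
def fake_load(chunk_id):
    loads.append(chunk_id)
    return f"data-{chunk_id}"

cache = ChunkCache(fake_load, capacity=2)
cache.get("a"); cache.get("b"); cache.get("a")   # "a" is now most recent
cache.get("c")                                   # evicts "b", not "a"
cache.get("a")                                   # still cached: no reload
```

A similarity-based policy, as suggested above, would replace the `popitem(last=False)` eviction with a choice based on how closely each cached object relates to the objects currently in use.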
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org
