Hi Barbara,

On Feb 14, 2011, at 5:01 PM, Collignon, Barbara C. wrote:

> 
> I have an HDF5 file (~500 MB) that is opened for parallel access and independent
> read (each process reads its own part of the file).
> 
> I thought the memory requirement would be 500 MB / Number_of_processes... but I
> am getting memory failures and was wondering whether it could be much more than
> that?
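[Editor's note: for concreteness, here is a minimal sketch of the pattern described
above: each rank opens the file with the MPI-IO driver and reads only its own
hyperslab using an independent transfer, so each process only buffers its own slice.
The file name, dataset name, element type (double), and the assumption that the
dataset divides evenly across ranks are all placeholders, not details from the
original post.]

/* Sketch: independent parallel read of a 1-D dataset, one slice per rank. */
#include <mpi.h>
#include <hdf5.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Open the file with the MPI-IO file driver. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
    hid_t file = H5Fopen("FILE_NAME.h5", H5F_ACC_RDONLY, fapl);   /* placeholder name */
    hid_t dset = H5Dopen2(file, "DSET_NAME", H5P_DEFAULT);        /* placeholder name */

    /* Select this rank's slice of the dataset. */
    hid_t fspace = H5Dget_space(dset);
    hsize_t dims[1];
    H5Sget_simple_extent_dims(fspace, dims, NULL);
    hsize_t count  = dims[0] / nprocs;            /* assumes even division */
    hsize_t offset = (hsize_t)rank * count;
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &offset, NULL, &count, NULL);
    hid_t mspace = H5Screate_simple(1, &count, NULL);

    /* Independent (non-collective) transfer. */
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_INDEPENDENT);

    double *buf = malloc(count * sizeof(double)); /* each rank buffers only its slice */
    H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace, dxpl, buf);

    free(buf);
    H5Pclose(dxpl); H5Sclose(mspace); H5Sclose(fspace);
    H5Dclose(dset); H5Fclose(file); H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}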

        It should be in that range of memory, plus a little for buffering and 
caching.  Is the dataset chunked or contiguous?
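        If you're not sure, one way to check is to query the dataset creation
property list (a hedged sketch, assuming an already-open dataset handle "dset" and
the usual <hdf5.h>/<stdio.h> includes). If it is chunked, note that each open
dataset also keeps a chunk cache (roughly 1 MB by default), which is part of the
extra buffering/caching mentioned above.

/* Sketch: report a dataset's storage layout. */
static void print_layout(hid_t dset)
{
    hid_t dcpl = H5Dget_create_plist(dset);
    switch (H5Pget_layout(dcpl)) {
    case H5D_CHUNKED: {
        hsize_t chunk_dims[H5S_MAX_RANK];
        int ndims = H5Pget_chunk(dcpl, H5S_MAX_RANK, chunk_dims);
        printf("chunked (%d-D chunks, first chunk dim = %llu)\n",
               ndims, (unsigned long long)chunk_dims[0]);
        break;
    }
    case H5D_CONTIGUOUS:
        printf("contiguous\n");
        break;
    default:
        printf("other layout (e.g. compact)\n");
        break;
    }
    H5Pclose(dcpl);
}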

        Quincey


