I am reading a large number (186) of compressed HDF5 data files. Eventually the allocated memory exceeds 20 GB and the program fails.
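
For context, the read loop looks roughly like this (a minimal sketch; the file list, dataset name, and buffer handling are placeholders for my actual code):

    #include "hdf5.h"

    /* Open each compressed file, read one dataset, close everything. */
    for (int i = 0; i < nfiles; i++) {
        hid_t file = H5Fopen(filenames[i], H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset = H5Dopen2(file, "/data", H5P_DEFAULT);

        H5Dread(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buffer);

        H5Dclose(dset);
        H5Fclose(file);
    }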

I've traced the increasing memory allocation to the H5Z_filter_deflate() function in H5Zdeflate.c, specifically the lines:

       nalloc *= 2;
       if(NULL == (new_outbuf = H5MM_realloc(outbuf, nalloc))) {
           (void)inflateEnd(&z_strm);
           HGOTO_ERROR(H5E_RESOURCE, H5E_NOSPACE, 0, "memory allocation failed for deflate uncompression")

It appears as if this memory is never freed, causing the memory leak.

Is this the desired behaviour or could there be a bug that prevents this
memory from being freed?
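
In case it helps narrow things down, this is the kind of check I can add on my side, right before the H5Fclose() in the loop sketched above, to rule out unclosed identifiers keeping the decompression buffers alive. H5Fget_obj_count() and H5garbage_collect() are the standard library calls; the surrounding variables are placeholders:

    /* After H5Dclose(), verify nothing else is still open on this
     * file before closing it, then ask the library to release its
     * internal free-list memory. */
    ssize_t nopen = H5Fget_obj_count(file, H5F_OBJ_ALL | H5F_OBJ_LOCAL);
    if (nopen > 1)   /* the file identifier itself counts as one */
        fprintf(stderr, "file %s still has %ld open objects\n",
                filenames[i], (long)nopen);

    H5Fclose(file);
    H5garbage_collect();   /* release free-list memory held internally */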


=========================
Joel Gales
Futuretech Corp.

SIMBIOS Code 616.0
Phone: (301) 286-1403
FAX:   (301) 286-0268

Bin Globally  Map Locally
