I'm having some odd problems with a single-threaded application that works with a single HDF5 file and writes to multiple datasets. Each dataset was opened through its own file hid_t, even though the underlying file is the same (i.e., the same file is opened once per dataset). Are there any known problems with doing this?
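
To make the pattern concrete, here's a minimal sketch of what I mean (not my actual code; the file name, dataset names, and buffer sizes are placeholders, and error checking is omitted):

#include "hdf5.h"

int main(void)
{
    /* Open the same physical file twice, so each dataset ends up
       with its own file hid_t.  Placeholder names throughout. */
    hid_t file_a = H5Fopen("data.h5", H5F_ACC_RDWR, H5P_DEFAULT);
    hid_t file_b = H5Fopen("data.h5", H5F_ACC_RDWR, H5P_DEFAULT);

    hid_t dset_a = H5Dopen2(file_a, "/dataset_a", H5P_DEFAULT);
    hid_t dset_b = H5Dopen2(file_b, "/dataset_b", H5P_DEFAULT);

    int buf_a[100] = {0};
    int buf_b[100] = {0};

    /* Interleaved writes through the two file handles into the one file. */
    H5Dwrite(dset_a, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf_a);
    H5Dwrite(dset_b, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf_b);

    H5Dclose(dset_a);
    H5Dclose(dset_b);
    H5Fclose(file_a);
    H5Fclose(file_b);
    return 0;
}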

What I'm seeing is:
1) Hangs on reads, with this stack:
 [1] H5V_chunk_index
 [2] H5D_create_chunk_file_map_hyper
 [3] H5D_chunk_io_init
 [4] H5D_read
 [5] H5Dread
2) Corruption of a dataspace (the memory dataspace, observed just before a write).

Chances are pretty good, I'd say, that I have a memory overwrite somewhere in my own code that's causing this, but I thought I'd throw the question out there while I search.

