Hi John,

On Aug 24, 2010, at 1:13 PM, John Knutson wrote:

> I'm having some odd problems with a single-threaded application that has a 
> single file open and is writing to multiple data sets.  Each data set has its 
> own file hid_t, even though the file is the same.  Are there any known 
> problems with doing this?

        Nope, this should work.

> What I'm seeing is:
> 1) hanging on reads.  Stack:
> [1] H5V_chunk_index
> [2] H5D_create_chunk_file_map_hyper
> [3] H5D_chunk_io_init
> [4] H5D_read
> [5] H5Dread
> 2) corruption of a data space (memory space, just before a write)
> 
> Chances are pretty good, I'd say, that I have another memory overwrite 
> somewhere in my code that's causing this, but I thought I'd throw the 
> question out there while I search.

        Nothing comes to mind for that stack or behavior.  Can you run with 
valgrind or some other memory debugger?

        Quincey


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.hdfgroup.org/mailman/listinfo/hdf-forum_hdfgroup.org