Hi Mohamad Chaarawi,

I inserted the code into the following program:

http://beige.ucs.indiana.edu/I590/node88.html

just before closing the file, and ran it. I got no errors.

Here is how I compiled and ran the program:
mpicc -DDEBUG -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -o mkrandpfile mkrandpfile.c
mpiexec -n 8 mkrandpfile -f test -l 20
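
For reference, the check itself is only a few lines. Here is a minimal standalone sketch of what I added (this is not mkrandpfile.c itself; the file name "testfile" and the small per-rank write are placeholders I picked for illustration):

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_File fh;
    int rank, rc;
    char buf[16] = "atomicity test\n";

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Collectively create/open a shared file ("testfile" is a placeholder name). */
    rc = MPI_File_open(MPI_COMM_WORLD, "testfile",
                       MPI_MODE_CREATE | MPI_MODE_RDWR,
                       MPI_INFO_NULL, &fh);
    if (rc != MPI_SUCCESS)
        MPI_Abort(MPI_COMM_WORLD, 1);

    /* The actual check: request MPI atomic mode right after opening the file.
       File operations default to MPI_ERRORS_RETURN, so rc can be inspected. */
    rc = MPI_File_set_atomicity(fh, 1);
    if (rc != MPI_SUCCESS && rank == 0)
        fprintf(stderr, "MPI_File_set_atomicity failed: atomic mode not supported\n");

    /* Small write at a per-rank offset, then close. */
    MPI_File_write_at(fh, (MPI_Offset)rank * (MPI_Offset)sizeof(buf), buf,
                      (int)sizeof(buf), MPI_CHAR, MPI_STATUS_IGNORE);
    MPI_File_close(&fh);

    MPI_Finalize();
    return 0;
}

It can be compiled with the same mpicc flags and launched with mpiexec -n 8, just like the program above.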

With kind regards,
Imran

On 29 Apr 2014, at 17:06, Mohamad Chaarawi <[email protected]> wrote:

> Hi Imran,
>  
> Do the filesystem and the MPI version you are using support MPI atomic mode?
> You can check that by taking any simple MPI I/O program and adding
> MPI_File_set_atomicity(fh, 1);
> right after opening the file.
>  
> This is what this test is trying to do. Granted, this is not a very well-supported 
> feature in terms of performance, so if this test fails, you can ignore it as 
> long as you do not use H5Fset_mpi_atomicity().
> But if you are using that routine (as this test does) and your filesystem 
> does not support MPI file atomicity, then you will see that failure.
>  
> Thanks,
> Mohamad
>  
> From: Hdf-forum [mailto:[email protected]] On Behalf Of Imran Ali
> Sent: Monday, April 28, 2014 12:08 PM
> To: [email protected]
> Subject: [Hdf-forum] IOError: Unable to create file (Mpi_err_other: known error not in list)
>  
> I have recently opened an issue at the h5py GitHub repository 
> (https://github.com/h5py/h5py/issues/434#issuecomment-41567806) about an 
> error I am having with my Parallel HDF5 install. I installed HDF5 
> through the dorsal script 
> (https://github.com/FEniCS/dorsal/blob/master/FEniCS/packages/hdf5.package). 
> The install completes without any errors. However, when I try to use h5py 
> (using a test run script), I get an odd error:
>  
> ................EE................................................................................................................sss...........................x.......................s.......................................x.................................s...............................................
> ======================================================================
> ERROR: test_mpi_atomic (h5py.tests.test_file.TestDrivers)
> Enable atomic mode for MPIO driver
> 
> Traceback (most recent call last):
>   File "/uio/hume/student-u29/imranal/Work/FEniCS/lib/python2.7/site-packages/h5py-2.3.0-py2.7-linux-x86_64.egg/h5py/tests/test_file.py", line 251, in test_mpi_atomic
>     with File(fname, 'w', driver='mpio', comm=MPI.COMM_WORLD) as f:
>   File "/uio/hume/student-u29/imranal/Work/FEniCS/lib/python2.7/site-packages/h5py-2.3.0-py2.7-linux-x86_64.egg/h5py/_hl/files.py", line 222, in __init__
>     fid = make_fid(name, mode, userblock_size, fapl)
>   File "/uio/hume/student-u29/imranal/Work/FEniCS/lib/python2.7/site-packages/h5py-2.3.0-py2.7-linux-x86_64.egg/h5py/_hl/files.py", line 85, in make_fid
>     fid = h5f.create(name, h5f.ACC_TRUNC, fapl=fapl, fcpl=fcpl)
>   File "h5f.pyx", line 90, in h5py.h5f.create (h5py/h5f.c:2222)
> IOError: Unable to create file (Mpi_err_other: known error not in list)
> 
> ======================================================================
> ERROR: test_mpio (h5py.tests.test_file.TestDrivers)
> MPIO driver and options
> 
> Traceback (most recent call last):
>   File "/uio/hume/student-u29/imranal/Work/FEniCS/lib/python2.7/site-packages/h5py-2.3.0-py2.7-linux-x86_64.egg/h5py/tests/test_file.py", line 241, in test_mpio
>     with File(fname, 'w', driver='mpio', comm=MPI.COMM_WORLD) as f:
>   File "/uio/hume/student-u29/imranal/Work/FEniCS/lib/python2.7/site-packages/h5py-2.3.0-py2.7-linux-x86_64.egg/h5py/_hl/files.py", line 222, in __init__
>     fid = make_fid(name, mode, userblock_size, fapl)
>   File "/uio/hume/student-u29/imranal/Work/FEniCS/lib/python2.7/site-packages/h5py-2.3.0-py2.7-linux-x86_64.egg/h5py/_hl/files.py", line 85, in make_fid
>     fid = h5f.create(name, h5f.ACC_TRUNC, fapl=fapl, fcpl=fcpl)
>   File "h5f.pyx", line 90, in h5py.h5f.create (h5py/h5f.c:2222)
> IOError: Unable to create file (Mpi_err_other: known error not in list)
> 
> Ran 306 tests in 0.382s
> 
> FAILED (errors=2, skipped=5, expected failures=2)
> 
> I am able to use HDF5 just fine when I do not use the mpio driver along with 
> the MPI communicator.

_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
