I rebuilt, letting fenics build PETSc instead of using the system PETSc, and the above error is gone. Fenics compiles, and I can load dolfin in Python. However, if I try to run a test case (any test case), I get this error:

[t-cn0109.hpc2n.umu.se:31129] [[INVALID],INVALID] ORTE_ERROR_LOG: Attempt to redefine an existing data type in file ../../../orte/runtime/orte_globals.c at line 189
[t-cn0109.hpc2n.umu.se:31130] [[INVALID],INVALID] ORTE_ERROR_LOG: Attempt to redefine an existing data type in file ../../../orte/runtime/orte_globals.c at line 189
[t-cn0109.hpc2n.umu.se:31131] [[INVALID],INVALID] ORTE_ERROR_LOG: Attempt to redefine an existing data type in file ../../../orte/runtime/orte_globals.c at line 189
[t-cn0109.hpc2n.umu.se:31132] [[INVALID],INVALID] ORTE_ERROR_LOG: Attempt to redefine an existing data type in file ../../../orte/runtime/orte_globals.c at line 189
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

 orte_dt_init failed
--> Returned value Attempt to redefine an existing data type (-31) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

 orte_dt_init failed
--> Returned value Attempt to redefine an existing data type (-31) instead of ORTE_SUCCESS


... (repeated)


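To give an idea of what I mean by "any test case": even a minimal script of my own like the one below (not one of the shipped demos) touches MPI as soon as the first mesh is built, so I would expect it to fail the same way.

    # minimal sketch of my own, just enough to trigger MPI initialisation in dolfin
    from dolfin import UnitSquareMesh, FunctionSpace

    mesh = UnitSquareMesh(8, 8)              # MPI is initialised here
    V = FunctionSpace(mesh, "Lagrange", 1)
    print(mesh.num_cells())
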
Is my version of MPI a problem? The compilers, perhaps? I have compiled with GCC 4.6.3 and OpenMPI 1.6.5.

Attaching my yaml file.



I took a look at the libraries linked to dolfin, and it looks as if there is indeed an MPI problem: it links in not just the MPI version I want it to use (OpenMPI 1.6.5, which is installed as a module), but also the one installed with the OS. I am guessing this causes some sort of conflict, but I have no idea how to stop it from happening.

This is the output from ldd libdolfin.so

        linux-vdso.so.1 =>  (0x00007fff691d0000)
        libxml2.so.2 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/libxml2/ohwsguokc2rl/lib/libxml2.so.2 (0x00002b7d50c58000)
        libboost_filesystem.so.1.55.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/boost/pgpg4cjabhlf/lib/libboost_filesystem.so.1.55.0 (0x00002b7d50fc0000)
        libboost_program_options.so.1.55.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/boost/pgpg4cjabhlf/lib/libboost_program_options.so.1.55.0 (0x00002b7d511d8000)
        libboost_system.so.1.55.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/boost/pgpg4cjabhlf/lib/libboost_system.so.1.55.0 (0x00002b7d51450000)
        libboost_thread.so.1.55.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/boost/pgpg4cjabhlf/lib/libboost_thread.so.1.55.0 (0x00002b7d51658000)
        libboost_iostreams.so.1.55.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/boost/pgpg4cjabhlf/lib/libboost_iostreams.so.1.55.0 (0x00002b7d51870000)
        libhdf5.so.8 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/hdf5/5wwfxxazmvaz/lib/libhdf5.so.8 (0x00002b7d51a88000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00002b7d51f78000)
        libz.so.1 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/zlib/ybkphsca7v4h/lib/libz.so.1 (0x00002b7d52198000)
        libml.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libml.so.11 (0x00002b7d523b0000)
        libzoltan.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libzoltan.so.11 (0x00002b7d528c0000)
        libifpack.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libifpack.so.11 (0x00002b7d52bb8000)
        libamesos.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libamesos.so.11 (0x00002b7d52f98000)
        libepetraext.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libepetraext.so.11 (0x00002b7d53270000)
        libbelosepetra.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libbelosepetra.so.11 (0x00002b7d53580000)
        libbelos.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libbelos.so.11 (0x00002b7d538c0000)
        libepetra.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libepetra.so.11 (0x00002b7d53ac8000)
        libkokkosdisttsqr.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libkokkosdisttsqr.so.11 (0x00002b7d53e30000)
        libkokkosnodetsqr.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libkokkosnodetsqr.so.11 (0x00002b7d54048000)
        libteuchosnumerics.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libteuchosnumerics.so.11 (0x00002b7d542c0000)
        libteuchoscomm.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libteuchoscomm.so.11 (0x00002b7d544e8000)
        libteuchosparameterlist.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libteuchosparameterlist.so.11 (0x00002b7d547c8000)
        libteuchoscore.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libteuchoscore.so.11 (0x00002b7d54d20000)
        libopenblas.so.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/openblas/4lzg23k34ugh/lib/libopenblas.so.0 (0x00002b7d54f88000)
        libparmetis.so => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/parmetis/geuhf7fxgppt/lib/libparmetis.so (0x00002b7d55f70000)
        libmpi_cxx.so.0 => /usr/lib/libmpi_cxx.so.0 (0x00002b7d561f0000)
        libmpi.so.0 => /usr/lib/libmpi.so.0 (0x00002b7d56410000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00002b7d566c8000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00002b7d569c8000)
        libgomp.so.1 => /usr/lib/x86_64-linux-gnu/libgomp.so.1 (0x00002b7d56cc8000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00002b7d56ed8000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00002b7d570f0000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00002b7d574b0000)
        liblzma.so.5 => /usr/lib/x86_64-linux-gnu/liblzma.so.5 (0x00002b7d576b8000)
        librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00002b7d578e0000)
        libbz2.so.1.0 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/bzip2/36p53sj2wv7t/lib/libbz2.so.1.0 (0x00002b7d57af0000)
        libmpi.so.1 => /lap/openmpi/1.8.1/gcc-4.6/lib/libmpi.so.1 (0x00002b7d57d00000)
        /lib64/ld-linux-x86-64.so.2 (0x00002b7d50008000)
        libgaleri.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libgaleri.so.11 (0x00002b7d57fd8000)
        libaztecoo.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libaztecoo.so.11 (0x00002b7d58220000)
        libmpi_cxx.so.1 => /lap/openmpi/1.8.1/gcc-4.6/lib/libmpi_cxx.so.1 (0x00002b7d584b8000)
        libteuchosremainder.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libteuchosremainder.so.11 (0x00002b7d586d8000)
        libtriutils.so.11 => /pfs/nobackup/home/b/bbrydsoe/fenics_5_build/bld/trilinos/qcaqlx2soyjs/lib/libtriutils.so.11 (0x00002b7d588f8000)
        libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00002b7d58b58000)
        libopen-rte.so.0 => /usr/lib/libopen-rte.so.0 (0x00002b7d58e70000)
        libopen-rte.so.7 => /lap/openmpi/1.8.1/gcc-4.6/lib/libopen-rte.so.7 (0x00002b7d590c0000)
        libopen-pal.so.6 => /lap/openmpi/1.8.1/gcc-4.6/lib/libopen-pal.so.6 (0x00002b7d59340000)
        libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00002b7d59618000)
        libopen-pal.so.0 => /usr/lib/libopen-pal.so.0 (0x00002b7d59850000)
        libnuma.so.1 => /usr/lib/libnuma.so.1 (0x00002b7d59aa8000)
        libpciaccess.so.0 => /usr/lib/x86_64-linux-gnu/libpciaccess.so.0 (0x00002b7d59cb8000)
        libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00002b7d59ec8000)


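In case it helps, here is a small diagnostic sketch of mine (Linux-specific, it just reads /proc/self/maps) to confirm at runtime which MPI-related shared objects actually end up mapped into the Python process once dolfin is imported:

    # my own diagnostic sketch: import dolfin, then list which MPI-related
    # shared objects are actually mapped into this process (Linux only)
    import dolfin  # importing is enough to pull in libdolfin and its MPI deps

    found = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            path = line.split()[-1]
            if "libmpi" in path or "libopen-rte" in path or "libopen-pal" in path:
                found.add(path)

    for path in sorted(found):
        print(path)
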

How do I set things up in the yaml file so that fenics only grabs the MPI in

/lap/openmpi/1.8.1/gcc-4.6/

and not the one in

/usr/lib/?


Thanks,

Birgitte

