Re: [OMPI devel] Segfault on MPI init

2017-03-30 Thread Cyril Bordage
Hello, when I went back to this problem, the segfault did not happen again... I do not know why, but I am glad about that. Cyril. On 13/02/2017 at 10:15, Cyril Bordage wrote: > Unfortunately this does not complete this thread. The problem is not solved! It is not an installation problem. I h

Re: [OMPI devel] Segfault on MPI init

2017-02-21 Thread Christopher Samuel
On 15/02/17 00:45, Gilles Gouaillardet wrote: > i would expect orted to generate a core, and then you can use gdb post mortem to get the stack trace. > there should be several threads, so you can run info threads and bt; you might have to switch to another thread. You can also get a backtrace from al

Re: [OMPI devel] Segfault on MPI init

2017-02-14 Thread Gilles Gouaillardet
Cyril, your first post mentions a crash in orted, but the stack trace is that of an MPI task. i would expect orted to generate a core, and then you can use gdb post mortem to get the stack trace. there should be several threads, so you can run info threads and bt; you might have to switch to another threa
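Gilles's post-mortem advice can be sketched as a short gdb session. This is a minimal sketch, not the poster's exact commands; the binary path and core-file name are hypothetical, and it assumes core dumps are enabled:

```shell
# Allow core files before reproducing the crash (run in the launching shell).
ulimit -c unlimited 2>/dev/null || true

# Open the daemon binary together with its core file (hypothetical paths):
#   gdb /path/to/openmpi/bin/orted core.17730
# Inside gdb:
#   (gdb) info threads          # list every thread in the dump
#   (gdb) bt                    # backtrace of the current thread
#   (gdb) thread 2              # switch to another thread, then bt again
#   (gdb) thread apply all bt   # backtraces for all threads at once
# Or non-interactively, in one batch invocation:
#   gdb -batch -ex "thread apply all bt" /path/to/openmpi/bin/orted core.17730
```

The batch form is handy when the crash happens on a remote node you can only reach through a job script.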

Re: [OMPI devel] Segfault on MPI init

2017-02-14 Thread Cyril Bordage
I have no MPI installation in my environment. If that were the case, would I have an error even though I use the complete path for mpirun? I finally managed to get a backtrace: #0 0x77533f18 in _exit () from /lib64/libc.so.6 #1 0x75169d68 in rte_abort (status=-51, report=true) at ../../..

Re: [OMPI devel] Segfault on MPI init

2017-02-14 Thread Jeff Squyres (jsquyres)
You should also check your paths for non-interactive remote logins and ensure that you are not accidentally mixing versions of Open MPI (e.g., the new version on your local machine, and some other version on the remote machines). Sent from my phone. No type good. > On Feb 13, 2017, at 8:14 AM
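Jeff's check can be made explicit. A minimal sketch assuming bash; the remote host name node01 and the orted path are hypothetical:

```shell
# Which mpirun does the local interactive shell resolve?
local_mpirun=$(command -v mpirun || echo "mpirun not on PATH")
echo "local: $local_mpirun"

# Non-interactive remote logins read different startup files (e.g. ~/.bashrc
# but typically not ~/.bash_profile), so query a remote node the same way:
#   ssh node01 'command -v mpirun; mpirun --version | head -1'
# And check that orted does not pull in libraries from an older install:
#   ssh node01 'ldd /path/to/openmpi/bin/orted | grep -i mpi'
```

If the local and remote results differ, the daemons on the remote nodes may be the wrong version, which matches the kind of startup segfault discussed here.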

Re: [OMPI devel] Segfault on MPI init

2017-02-13 Thread Gilles Gouaillardet
Cyril, Are you running your jobs via a batch manager? If yes, was support for it correctly built? If you were able to get a core dump, can you post the gdb stack trace? I guess your nodes have several IP interfaces; you might want to try mpirun --mca oob_tcp_if_include eth0 ... (replace eth0 wi
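The suggested workaround looks like this on the command line. A sketch with hypothetical interface and program names; list your node's interfaces first to pick one that all nodes share:

```shell
# List this node's IP interfaces to find one common to all nodes.
ip -brief addr show 2>/dev/null || ifconfig -a 2>/dev/null || true

# Then pin Open MPI's out-of-band TCP channel (and, if needed, the TCP BTL)
# to that interface so the daemons do not connect over the wrong network:
#   mpirun --mca oob_tcp_if_include eth0 --mca btl_tcp_if_include eth0 \
#          -np 4 ./my_program
```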

Re: [OMPI devel] Segfault on MPI init

2017-02-13 Thread Cyril Bordage
Unfortunately this does not complete this thread. The problem is not solved! It is not an installation problem. I have no previous installation since I use separate directories. I have nothing MPI-specific in my environment's paths; I just use the complete path to mpicc and mpirun. The error depends on wh

Re: [OMPI devel] Segfault on MPI init

2017-02-10 Thread George Bosilca
To complete this thread, the problem is now solved. Some .so files were lingering around from a previous installation, causing startup problems. George. > On Feb 10, 2017, at 05:38, Cyril Bordage wrote: > > Thank you for your answer. > I am running the git master version (last tested was cad4c03). > >

Re: [OMPI devel] Segfault on MPI init

2017-02-10 Thread Cyril Bordage
Thank you for your answer. I am running the git master version (last tested was cad4c03). FYI, Clément Foyer is talking with George Bosilca about this problem. Cyril. On 08/02/2017 at 16:46, Jeff Squyres (jsquyres) wrote: > What version of Open MPI are you running? > > The error is indicatin

Re: [OMPI devel] Segfault on MPI init

2017-02-08 Thread Jeff Squyres (jsquyres)
What version of Open MPI are you running? The error is indicating that Open MPI is trying to start a user-level helper daemon on the remote node, and the daemon is seg faulting (which is unusual). One thing to be aware of: https://www.open-mpi.org/faq/?category=building#install-overwrite
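The FAQ entry Jeff links warns against installing a new Open MPI over an old one. A minimal sketch of the safer pattern, with a hypothetical prefix:

```shell
# Install each build into its own empty prefix so no stale .so files from a
# previous installation can be picked up at daemon startup:
#   ./configure --prefix=$HOME/ompi/master && make -j8 install
# Then call the tools by full path, so the compiler and launcher match:
#   $HOME/ompi/master/bin/mpicc hello_mpi.c -o hello_mpi
#   $HOME/ompi/master/bin/mpirun -np 2 ./hello_mpi
```

This mirrors what eventually resolved the thread: the original failure was traced to leftover .so files from an earlier installation.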

[OMPI devel] Segfault on MPI init

2017-02-06 Thread Cyril Bordage
Hello, I cannot run a program with MPI when I compile it myself. On some nodes I have the following error: [mimi012:17730] *** Process received signal *** [mimi012:17730] Signal: Segmentation fault (11) [mimi012:1