Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread David Shrader
Interestingly enough, I have found that using --disable-dlopen causes the seg fault whether or not --enable-mca-no-build=coll-ml is used. That is, the following configure line generates a build of Open MPI that will *not* seg fault when running a simple hello world program: ./configure --prefi…

Re: [OMPI users] segfault on java binding from MPI.init()

2015-08-13 Thread Howard Pritchard
Hi Nate, The odls output helps some. You have a really big CLASSPATH. Also there might be a small chance that the shmem.jar is causing problems. Could you try undefining your CLASSPATH just to run the test case? If the little test case still doesn't work, could you reconfigure the mpi build to…

[OMPI users] connection time out (110)

2015-08-13 Thread Ehsan Moradi
Hi my friends, I am getting the error "connection timed out (110)" even after: echo 1 > /proc/sys/net/core/somaxconn; echo 10 > /proc/sys/net/core/netdev_max_backlog; mpirun --mca oob_tcp_listen_mode listen_thread -np 1024 my_mpi_program. My program works on 2 nodes only; if I add one more it's going t…
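The workaround the poster describes can be sketched as a shell fragment. The sysctl values (1 and 10) are the ones from the message, though they are unusually small for backlog tuning; the program name is the poster's placeholder:

```shell
#!/bin/sh
# Raise the kernel's connection-backlog limits (must run as root).
# The poster used 1 and 10; real tuning typically uses much larger values.
echo 1 > /proc/sys/net/core/somaxconn            # cap on listen() backlog
echo 10 > /proc/sys/net/core/netdev_max_backlog  # inbound packet queue length

# Run with a dedicated listening thread for Open MPI's out-of-band TCP
# channel, which helps when many ranks connect back at startup.
mpirun --mca oob_tcp_listen_mode listen_thread -np 1024 my_mpi_program
```

With 1024 ranks connecting back to mpirun at launch, a tiny accept backlog is a plausible source of connection timeouts, which is why both the sysctl values and the listen-thread mode matter here.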

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread Nathan Hjelm
David, to change that option, edit the toss-common file. It is in the same location as the platform file. We have a number of components we disable by default. Just add coll-ml to the end of the list. -Nathan On Thu, Aug 13, 2015 at 05:19:35PM +, Jeff Squyres (jsquyres) wrote: > Ah, if you'r…

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread Jeff Squyres (jsquyres)
Ah, if you're using --disable-dlopen, then you won't find individual plugin DSOs. Instead, you can configure this way: ./configure --enable-mca-no-build=coll-ml ... This will disable the build of the coll/ml component altogether. > On Aug 13, 2015, at 11:23 AM, David Shrader wrote: > > Hey…
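Jeff's suggestion, as a configure sketch (the prefix is illustrative; any other options from your platform file would go on the same line):

```shell
# Build Open MPI with the coll/ml component excluded from the build entirely.
# --enable-mca-no-build takes a comma-separated list of framework-component
# pairs, here just coll-ml.
./configure --prefix=/opt/openmpi-1.8.8 \
            --enable-mca-no-build=coll-ml
make && make install
```

Because the component is never compiled, this works regardless of whether dlopen support is enabled, unlike removing plugin files after installation.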

Re: [OMPI users] open mpi upgrade

2015-08-13 Thread Gustavo Correa
Hi Ehsan, You didn't tell us the details of how you configured and installed Open MPI. However, you must point configure's --prefix to the installation directory, say: ./configure --prefix=/opt/openmpi-1.8.8 In addition, the installation directory must be *different* from the directory where…
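Gustavo's advice, sketched end to end (paths are illustrative; the key points are the --prefix and putting the new install first on the search paths):

```shell
# Unpack and build in a source directory separate from the install prefix.
tar xzf openmpi-1.8.8.tar.gz
cd openmpi-1.8.8
./configure --prefix=/opt/openmpi-1.8.8
make && make install        # may require sudo for /opt

# Put the new installation ahead of the old 1.2.8 on the search paths,
# otherwise "mpirun" keeps resolving to the old version.
export PATH=/opt/openmpi-1.8.8/bin:$PATH
export LD_LIBRARY_PATH=/opt/openmpi-1.8.8/lib:$LD_LIBRARY_PATH

mpirun --version            # should now report 1.8.8
```

The PATH ordering is the usual explanation for "I installed 1.8.8 but mpirun still says 1.2.8": the shell finds the old binary first.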

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread Nathan Hjelm
David, our platform files disable dlopen. That is why you are not seeing any component files. coll/ml is built into libmpi.so. -Nathan On Thu, Aug 13, 2015 at 09:23:09AM -0600, David Shrader wrote: > Hey Jeff, > > I'm actually not able to find coll_ml related files at that location. All I > see…

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread David Shrader
Hey Jeff, I'm actually not able to find coll_ml related files at that location. All I see are the following files: [dshrader@zo-fe1 openmpi]$ ls /usr/projects/hpcsoft/toss2/zorrillo/openmpi/1.8.8-gcc-4.4/lib/openmpi/ libompi_dbg_msgq.a libompi_dbg_msgq.la libompi_dbg_msgq.so In this parti…

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread David Shrader
I don't have that option on the configure command line, but my platform file uses "enable_dlopen=no". I imagine that has the same effect. Thank you for the pointer! Thanks, David On 08/12/2015 05:04 PM, Deva wrote: do you have "--disable-dlopen" in your configure option? This might…

[OMPI users] Trouble with udcm and rdmacm

2015-08-13 Thread Tobias Kloeffel
Hi all, The configuration might be a bit exotic: Kernel 4.1.5 vanilla, Mellanox OFED 3.0-2.0.1, ccc174 1 x dual port ConnectX-3, mini4 2 x single port ConnectX-2, mini2 8 x single port ConnectX-2, MIS20025. The following does work: using the oob connection manager in 1.7.3: everything works, excep…

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread Jeff Squyres (jsquyres)
Note that this will require you to have fairly recent GNU Autotools installed. Another workaround for avoiding the coll ml module would be to install Open MPI as normal, and then rm the following files after installation: rm $prefix/lib/openmpi/mca_coll_ml* This will physically remove the co…
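The post-install removal Jeff describes, sketched with an illustrative prefix. As Nathan points out elsewhere in the thread, this only works when components are built as individual DSOs, i.e. when dlopen support is NOT disabled:

```shell
# Remove the coll/ml plugin files so Open MPI cannot load the component
# at runtime. $prefix is your installation prefix (illustrative value).
prefix=/opt/openmpi-1.8.8
rm -f "$prefix"/lib/openmpi/mca_coll_ml*
```

With --disable-dlopen (or enable_dlopen=no in a platform file) there are no mca_coll_ml* files to remove, because the component is linked directly into libmpi.so.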

Re: [OMPI users] Open MPI 1.8.8 and hcoll in system space

2015-08-13 Thread Gilles Gouaillardet
David, I guess you do not want to use the ml coll module at all. In Open MPI 1.8.8 you can simply do: touch ompi/mca/coll/ml/.ompi_ignore && ./autogen.pl && ./configure ... && make && make install, so the ml component is not even built. Cheers, Gilles On 8/13/2015 7:30 AM, David Shrader wrote: I remember…
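Gilles's .ompi_ignore approach, sketched as a build sequence. It must be run from a source tree, and re-running autogen.pl requires fairly recent GNU Autotools (as Jeff notes in this thread); the paths are illustrative:

```shell
# Mark the coll/ml component to be skipped by the build system, then
# regenerate the configure machinery so the marker takes effect.
cd openmpi-1.8.8                      # source tree
touch ompi/mca/coll/ml/.ompi_ignore
./autogen.pl
./configure --prefix=/opt/openmpi-1.8.8
make && make install
```

Like --enable-mca-no-build=coll-ml, this prevents the component from being compiled at all, so it works whether or not dlopen support is enabled.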

Re: [OMPI users] open mpi upgrade

2015-08-13 Thread Gilles Gouaillardet
Ehsan, how did you try to install Open MPI? Shall I assume you downloaded a tarball and ran configure && make install? Can you post the full commands you ran? Are you installing as root, or did you run sudo make install? If not, do you have write access to the /opt/openmpi-1.8.8 directory?

[OMPI users] open mpi upgrade

2015-08-13 Thread Ehsan Moradi
Hi, my dear friends. I tried to upgrade my Open MPI version from 1.2.8 to 1.8.8, but after installing it in a different directory, "/opt/openmpi-1.8.8/", when I run mpirun its version is still 1.2.8, and the directory "/opt/openmpi-1.8.8/" is empty!! So what should I do for installing and usi…