Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-16 Thread Cabral, Matias A
ROTHE Eduardo - externe Sent: Wednesday, January 16, 2019 9:29 AM To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send Hi Matias, thanks so much for your support! Actually running this simple example with --mca mtl_ofi_tag_mode ofi_tag_1 turns out to be a
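
For reference, a run that applies that tag-mode workaround would look something like the following (a sketch assembled from the flags quoted in the thread; ./a.out stands in for the user's test binary, which is not shown):

    mpirun -np 2 --mca pml cm --mca mtl ofi --mca mtl_ofi_tag_mode ofi_tag_1 ./a.out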

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-16 Thread ROTHE Eduardo - externe
suggesting that upgrading libfabric to 1.6.0 might save the day? Regards, Eduardo From: users on behalf of matias.a.cab...@intel.com Sent: Wednesday, January 16, 2019 00:54 To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send
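
A quick way to confirm which libfabric version and providers are actually installed, before or after such an upgrade, is the fi_info utility that ships with libfabric (a sketch; exact output format varies by version):

    fi_info --version    # report the libfabric version in use
    fi_info -p psm2      # list what the psm2 provider advertises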

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-15 Thread Cabral, Matias A
On Behalf Of ROTHE Eduardo - externe Sent: Tuesday, January 15, 2019 2:31 AM To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send Hi Matias, Thank you so much for your feedback! It's really embarrassing, but running mpirun -np 2 -mca mtl ofi -mca pml
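
When a run like that misbehaves, the component that actually got selected can be confirmed by raising the MTL framework's verbosity (a sketch, assuming the truncated pml argument above was cm; mtl_base_verbose is the standard MCA verbosity knob, and ./a.out is a stand-in binary):

    mpirun -np 2 --mca pml cm --mca mtl ofi --mca mtl_base_verbose 100 ./a.out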

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-15 Thread ROTHE Eduardo - externe
this be related? Regards, Eduardo From: users on behalf of matias.a.cab...@intel.com Sent: Saturday, January 12, 2019 00:32 To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send BTW, just to be explicit about using the psm2 OFI provider: /tmp>
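
To pin the OFI MTL to the psm2 libfabric provider explicitly, the provider include list can be set (a sketch; mtl_ofi_provider_include is the OFI MTL's provider filter, and ./a.out is a stand-in binary):

    mpirun -np 2 --mca pml cm --mca mtl ofi --mca mtl_ofi_provider_include psm2 ./a.out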

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-11 Thread Cabral, Matias A
From: users [mailto:users-boun...@lists.open-mpi.org] On Behalf Of Cabral, Matias A Sent: Friday, January 11, 2019 3:22 PM To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send Hi Eduardo, The OFI MTL got some new features during 2018 that went into v4.0.0 but are

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-11 Thread Cabral, Matias A
1 out of 2 Process 1 received number 10 from process 0 From: users [mailto:users-boun...@lists.open-mpi.org] On Behalf Of ROTHE Eduardo - externe Sent: Thursday, January 10, 2019 10:02 AM To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send Hi Gilles, thank you
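
The quoted output matches the classic two-rank send/receive test; a minimal C sketch of such a program (the poster's exact source is not in the thread) is:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size, number;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        if (rank == 0) {
            number = 10;
            /* rank 0 sends one int, tag 0, to rank 1 */
            MPI_Send(&number, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
            printf("Process 0 sent number %d to process 1 out of %d\n",
                   number, size);
        } else if (rank == 1) {
            MPI_Recv(&number, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            printf("Process 1 received number %d from process 0\n", number);
        }

        MPI_Finalize();
        return 0;
    }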

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-10 Thread ROTHE Eduardo - externe
r...@gmail.com Sent: Thursday, January 10, 2019 13:51 To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send Eduardo, You have two options to use OmniPath - “directly” via the psm2 mtl mpirun --mca pml cm --mca mtl psm2 ... - “indirectly” via libfabric mpirun --mca pml c
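
Spelled out in full (the second command is truncated above; its completion is visible in the reply quoted below, with ./a.out as a stand-in binary):

    # directly, via the psm2 MTL
    mpirun --mca pml cm --mca mtl psm2 ./a.out
    # indirectly, via libfabric's OFI MTL
    mpirun --mca pml cm --mca mtl ofi ./a.out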

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-10 Thread Peter Kjellström
On Thu, 10 Jan 2019 21:51:03 +0900 Gilles Gouaillardet wrote: > Eduardo, > > You have two options to use OmniPath > > - “directly” via the psm2 mtl > mpirun --mca pml cm --mca mtl psm2 ... > > - “indirectly” via libfabric > mpirun --mca pml cm --mca mtl ofi ... > > I do invite you to try both. By

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-10 Thread Gilles Gouaillardet
---- > *From:* users on behalf of > gilles.gouaillar...@gmail.com > *Sent:* Wednesday, January 9, 2019 15:16 > *To:* Open MPI Users > *Subject:* Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send > > Eduardo, > > The first part of the configure command line is for a

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-10 Thread Peter Kjellström
On Thu, 10 Jan 2019 11:20:12 + ROTHE Eduardo - externe wrote: > Hi Gilles, thank you so much for your support! > > For now I'm just testing the software, so it's running on a single > node. > > Your suggestion was very precise. In fact, choosing the ob1 component > leads to a successful ex
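
The successful fallback run would look something like this (a sketch; ob1 is the generic point-to-point PML that bypasses the MTLs entirely, and ./a.out is a stand-in binary):

    mpirun -np 2 --mca pml ob1 ./a.out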

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-10 Thread ROTHE Eduardo - externe
From: users on behalf of gilles.gouaillar...@gmail.com Sent: Wednesday, January 9, 2019 15:16 To: Open MPI Users Subject: Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send Eduardo, The first part of the configure command line is for an install in /usr, but then there is ‘--pref

Re: [OMPI users] Open MPI 4.0.0 - error with MPI_Send

2019-01-09 Thread Gilles Gouaillardet
Eduardo, The first part of the configure command line is for an install in /usr, but then there is ‘--prefix=/opt/openmpi/4.0.0’ and this is very fishy. You should also use ‘--with-hwloc=external’. How many nodes are you running on and which interconnect are you using? What if you mpirun --mca pml
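
A consistent configure line for an install under /opt/openmpi/4.0.0, folding in both points above, would be something like this sketch (any other options from the original configure line are omitted here):

    ./configure --prefix=/opt/openmpi/4.0.0 --with-hwloc=external
    make all install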