Re: [OMPI users] Parameters at run time

2019-10-20 Thread Gilles Gouaillardet via users
Raymond, In the case of UCX, you can mpirun --mca pml_base_verbose 10 ... If the pml/ucx component is used, then your app will run over UCX. If the pml/ob1 component is used, then you can mpirun --mca btl_base_verbose 10 ... btl/self should be used for communications to itself. If btl/uct
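For example, to spot the selected component in the verbose output (the executable name ./my_app is illustrative; the exact wording of the verbose output varies between releases):

  mpirun --mca pml_base_verbose 10 -np 2 ./my_app 2>&1 | grep pml
  mpirun --mca btl_base_verbose 10 -np 2 ./my_app 2>&1 | grep btl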

Re: [OMPI users] Error with OpenMPI: Could not resolve generic procedure mpi_irecv

2019-08-19 Thread Gilles Gouaillardet via users
> size = this%size_dim(this%gi)*this%size_dim(this%gj)*cs3
> if(this%is_exchange_off) then
>    call this%update_stats(size)
>    this%bf(:,:,1:cs3) = cmplx(0.,0.)
> else
>    call MPI_Irecv(this%bf(:,:,1:cs3),size,MPI_COMPLEX_TYPE,&
> t
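A frequent cause of "Could not resolve generic procedure" (an assumption on my side, not confirmed in this thread) is an argument that matches no specific interface in the mpi module, e.g. a count with a non-default integer kind. A minimal sketch, run with mpirun -np 1:

  program irecv_kinds
  use mpi
  implicit none
  integer :: ierr, reqs(2)
  integer :: n              ! default INTEGER matches the MPI_Irecv interface
  ! integer(kind=8) :: n    ! a non-default kind here fails to resolve the generic
  complex :: sbuf(8), rbuf(8)
  call MPI_Init(ierr)
  n = 8
  sbuf = cmplx(1., 0.)
  ! post the receive, then send to self, so a single rank suffices
  call MPI_Irecv(rbuf, n, MPI_COMPLEX, 0, 0, MPI_COMM_WORLD, reqs(1), ierr)
  call MPI_Isend(sbuf, n, MPI_COMPLEX, 0, 0, MPI_COMM_WORLD, reqs(2), ierr)
  call MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE, ierr)
  call MPI_Finalize(ierr)
  end program irecv_kinds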

Re: [OMPI users] Error with OpenMPI: Could not resolve generic procedure mpi_irecv

2019-08-19 Thread Gilles Gouaillardet via users
> On Mon, Aug 19, 2019 at 3:25 PM Gilles Gouaillardet via users wrote:
>> One more thing ...
>> Your initial message mentioned a failure with gcc 8.2.0, but your
>> follow-up message mentions the LLVM compiler.
>> So whi

Re: [OMPI users] Error with OpenMPI: Could not resolve generic procedure mpi_irecv

2019-08-19 Thread Gilles Gouaillardet via users
One more thing ... Your initial message mentioned a failure with gcc 8.2.0, but your follow-up message mentions the LLVM compiler. So which compiler did you use to build the Open MPI that fails to build your test? Cheers, Gilles On Mon, Aug 19, 2019 at 6:49 PM Gilles Gouaillardet wrote: >

Re: [OMPI users] Error with OpenMPI: Could not resolve generic procedure mpi_irecv

2019-08-19 Thread Gilles Gouaillardet via users
Thanks, and your reproducer is? Cheers, Gilles On Mon, Aug 19, 2019 at 6:42 PM Sangam B via users wrote:
> Hi,
> OpenMPI is configured as follows:
> export CC=`which clang`
> export CXX=`which clang++`
> export FC=`which flang`
> export F90=`which flang`
> ../configure

Re: [OMPI users] Error with OpenMPI: Could not resolve generic procedure mpi_irecv

2019-08-19 Thread Gilles Gouaillardet via users
Hi, Can you please post a full but minimal example that evidences the issue? Also please post your Open MPI configure command line. Cheers, Gilles Sent from my iPod > On Aug 19, 2019, at 18:13, Sangam B via users wrote: > Hi, > I get the following error if the application is compiled

Re: [OMPI users] OMPI was not built with SLURM's PMI support

2019-08-08 Thread Gilles GOUAILLARDET via users
Hi, You need to configure --with-pmi ... Cheers, Gilles On August 8, 2019, at 11:28 PM, Jing Gong via users wrote: Hi, Recently our Slurm system has been upgraded to 19.0.5. I tried to recompile openmpi v3.0 due to the bug reported in https://bugs.schedmd.com/show_bug.cgi?id=6993
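For reference, a typical configure invocation on a Slurm cluster looks like this (prefix and PMI path are illustrative; adjust them to your site):

  ./configure --prefix=/opt/openmpi-3.0 --with-slurm --with-pmi=/usr

where the --with-pmi directory is the one containing include/pmi.h (or include/pmi2.h) and the Slurm PMI libraries.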

Re: [OMPI users] OpenMPI 2.1.1 bug on Ubuntu 18.04.2 LTS

2019-08-01 Thread Gilles Gouaillardet via users
Junchao, Is the issue related to https://github.com/open-mpi/ompi/pull/4501? Jeff, you might have to configure with --enable-heterogeneous to evidence the issue. Cheers, Gilles On 8/2/2019 4:06 AM, Jeff Squyres (jsquyres) via users wrote: I am able to replicate the issue on a

Re: [OMPI users] When is it safe to free the buffer after MPI_Isend?

2019-07-27 Thread Gilles Gouaillardet via users
Carlos, MPI_Isend() does not automatically free the buffer after it sends the message (it simply cannot, since the buffer might point to a global variable or to the stack). Can you please extract a reproducer from your program? Out of curiosity, what if you insert a (useless)
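For reference, the safe pattern is to complete the request with MPI_Wait()/MPI_Test() before releasing the buffer. A minimal sketch (not the poster's code; send-to-self so it runs with mpirun -np 1):

  program isend_free
  use mpi
  implicit none
  integer :: ierr, reqs(2)
  real, allocatable :: sbuf(:), rbuf(:)
  call MPI_Init(ierr)
  allocate(sbuf(1000), rbuf(1000))
  sbuf = 1.0
  ! post the receive first, then the matching send (both on rank 0 itself)
  call MPI_Irecv(rbuf, 1000, MPI_REAL, 0, 0, MPI_COMM_WORLD, reqs(1), ierr)
  call MPI_Isend(sbuf, 1000, MPI_REAL, 0, 0, MPI_COMM_WORLD, reqs(2), ierr)
  ! the send buffer may not be reused or freed until the request completes
  call MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE, ierr)
  deallocate(sbuf, rbuf)   ! safe only after MPI_Waitall has completed the request
  call MPI_Finalize(ierr)
  end program isend_free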

Re: [OMPI users] How is the rank determined (Open MPI and Podman)

2019-07-22 Thread Gilles Gouaillardet via users
that Podman is running rootless. I will continue to investigate, but now I know where to look. Thanks! Adrian On Fri, Jul 12, 2019 at 06:48:59PM +0900, Gilles Gouaillardet via users wrote: Adrian, Can you try mpirun --mca btl_vader_copy_mechanism none ... Please double check the MCA

Re: [OMPI users] How is the rank determined (Open MPI and Podman)

2019-07-12 Thread Gilles Gouaillardet via users
>> --> Process # 0 of 2 is alive. ->test1
>> --> Process # 1 of 2 is alive. ->test2
>>
>> I need to tell Podman to mount /tmp from the host into the container, as
>> I am running rootless I also need to tell Podman to use the same user ID &

Re: [OMPI users] How is the rank determined (Open MPI and Podman)

2019-07-11 Thread Gilles Gouaillardet via users
Adrian, the MPI application relies on some environment variables (they typically start with OMPI_ and PMIX_). The MPI application internally uses a PMIx client that must be able to contact a PMIx server (that is included in mpirun and the orted daemon(s) spawned on the remote hosts).
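A quick generic check of what must survive the container launch (my suggestion, not from this thread):

  mpirun -np 1 env | egrep '^(OMPI_|PMIX_)'

Running the same command through Podman shows which of these variables get lost inside the container.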

Re: [OMPI users] Naming scheme of PSM2 and Vader shared memory segments

2019-07-07 Thread Gilles Gouaillardet via users
Sebastian, the PSM2 shared memory segment name is set by the PSM2 library, and my understanding is that Open MPI has no control over it. If you believe the root cause of the crash is a non-unique PSM2 shared memory segment name, I guess you should report this at

Re: [OMPI users] Problems with MPI_Comm_spawn

2019-07-02 Thread Gilles Gouaillardet via users
Thanks for the report, this is indeed a bug I fixed at https://github.com/open-mpi/ompi/pull/6790 ; meanwhile, you can manually download and apply the patch at https://github.com/open-mpi/ompi/pull/6790.patch Cheers, Gilles On 7/3/2019 1:30 AM, Gyevi-Nagy László via users wrote: Hi, I
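For reference, one way to apply such a patch from the top of the Open MPI source tree (a generic sketch, not instructions from the thread):

  curl -LO https://github.com/open-mpi/ompi/pull/6790.patch
  patch -p1 < 6790.patch

and then rebuild.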

Re: [OMPI users] Possible bugs in MPI_Neighbor_alltoallv()

2019-06-27 Thread Gilles Gouaillardet via users
Thanks Junchao, I issued https://github.com/open-mpi/ompi/pull/6782 in order to fix this (and the alltoallw variant as well). Meanwhile, you can manually download and apply the patch at https://github.com/open-mpi/ompi/pull/6782.patch Cheers, Gilles On 6/28/2019 1:10 PM, Zhang,

Re: [OMPI users] undefined reference error related to ucx

2019-06-25 Thread Gilles Gouaillardet via users
https://github.com/openucx/ucx/issues/3336 that UCX 1.6 might solve this issue, so I tried the pre-release version just to check if it will. All the best, -- Passant From: users on behalf of Gilles Gouaillardet via users Sent: Tuesday, June 25, 2019 11

Re: [OMPI users] undefined reference error related to ucx

2019-06-25 Thread Gilles Gouaillardet via users
Passant, UCX 1.6.0 is not yet officially released, and it seems Open MPI (4.0.1) does not support it yet; some porting is needed. Cheers, Gilles On Tue, Jun 25, 2019 at 5:13 PM Passant A. Hafez via users wrote: > Hello, > I'm trying to build ompi 4.0.1 with external ucx 1.6.0 but

Re: [OMPI users] error running mpirun command

2019-05-03 Thread Gilles Gouaillardet via users
Eric, which version of Open MPI are you using? How many hosts are in your hostsfile? The error message suggests this could be a bug within Open MPI, and a potential workaround for you would be to try mpirun -np 84 --hostfile hostsfile --mca routed direct ./openmpi_hello.c You might also want to

Re: [OMPI users] 3.0.4, 4.0.1 build failure on OSX Mojave with LLVM

2019-04-24 Thread Gilles Gouaillardet via users
John, what if you move some parameters to CPPFLAGS and CXXCPPFLAGS (see the new configure command line below)? Cheers, Gilles
'/Users/cary/projects/ulixesall-llvm/builds/openmpi-4.0.1/nodl/../configure' \
  --prefix=/Volumes/GordianStorage/opt/contrib-llvm7_appleclang/openmpi-4.0.1-nodl \
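In general terms, preprocessor-only options such as include paths go to the preprocessor flag variables (the paths below are illustrative, not the poster's actual flags):

  ../configure CPPFLAGS="-I/opt/contrib/include" CXXCPPFLAGS="-I/opt/contrib/include" ...

so that configure's preprocessor checks see the same search paths as the compilers do.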

Re: [OMPI users] fatal error: ac_nonexistent.h: No such file or directory (openmpi-4.0.0)

2019-04-20 Thread Gilles Gouaillardet via users
The root cause is that configure cannot run a simple Fortran program (see the relevant log below). I suggest you
export LD_LIBRARY_PATH=/share/apps/gcc-5.4.0/lib64:$LD_LIBRARY_PATH
and then try again. Cheers, Gilles
configure:44254: checking Fortran value of selected_int_kind(4)
configure:44281: