Re: [OMPI users] Problem when installing Rmpi package in HPC cluster

2016-07-11 Thread Bennet Fauber
We have found that virtually all Rmpi jobs need to be started with

    $ mpirun -np 1 R CMD BATCH ...

This is, as I understand it, because the first R process initializes the MPI environment, and then, when you create the cluster, it wants to be able to start the rest of the processes itself. When you ...
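A minimal sketch of that pattern (the script name rmpi_job.R and the worker count are hypothetical; mpi.spawn.Rslaves is the standard Rmpi call for starting the remaining processes the message refers to):

    # Launch with a single R process; Rmpi spawns the workers itself:
    #   $ mpirun -np 1 R CMD BATCH rmpi_job.R
    library(Rmpi)
    mpi.spawn.Rslaves(nslaves = mpi.universe.size() - 1)  # create the cluster
    mpi.remote.exec(paste("worker", mpi.comm.rank(), "of", mpi.comm.size()))
    mpi.close.Rslaves()
    mpi.quit()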

Re: [OMPI users] Problem when installing Rmpi package in HPC cluster

2016-07-11 Thread Gilles Gouaillardet
Note this is just a workaround; it simply disables the mxm MTL (i.e., the Mellanox-optimized InfiniBand driver). Basically, there are two ways to run a single-task MPI program (a.out):
- mpirun -np 1 ./a.out (this is the "standard" way)
- ./a.out (aka singleton mode)
The logs you posted do not ...
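In Rmpi terms, a sketch of the two launch modes (the script name check.R is hypothetical):

    # check.R: report how this R process entered MPI.
    #   "standard" way:  mpirun -np 1 Rscript check.R
    #   singleton mode:  Rscript check.R
    library(Rmpi)
    cat("world size:", mpi.comm.size(0),
        "- universe size:", mpi.universe.size(), "\n")
    mpi.quit()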

Re: [OMPI users] Problem when installing Rmpi package in HPC cluster

2016-07-11 Thread pan yang
Dear Gilles, I tried export OMPI_MCA_pml=ob1, and it worked! Thank you very much for your brilliant suggestion. By the way, I don't really understand what you mean by 'can you also extract the command that launches the test?'... Cheers, Pan
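For an interactive session, the same setting can also be made from within R, provided it happens before Rmpi loads, since OpenMPI reads OMPI_MCA_* variables when MPI is initialized (a sketch, not from the thread):

    Sys.setenv(OMPI_MCA_pml = "ob1")  # select the ob1 PML, bypassing mxm
    library(Rmpi)                     # MPI is initialized on load
    mpi.quit()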

Re: [OMPI users] Problem when installing Rmpi package in HPC cluster

2016-07-11 Thread Gilles Gouaillardet
That could be specific to mtl/mxm. Could you export OMPI_MCA_pml=ob1 and try again? Can you also extract the command that launches the test? I am curious whether this is via mpirun or as a singleton. Cheers, Gilles
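For reference, the same MCA parameter can be passed on the mpirun command line instead of through the environment (the script name test.R is hypothetical):

    # mpirun --mca pml ob1 -np 1 Rscript test.R

When the program runs as a singleton (no mpirun), the exported environment variable is the way to set it.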

[OMPI users] Problem when installing Rmpi package in HPC cluster

2016-07-11 Thread pan yang
Dear OpenMPI community, I ran into this problem while installing Rmpi:

> install.packages('Rmpi', repos='http://cran.r-project.org', configure.args=c(
+ '--with-Rmpi-include=/usr/mpi/gcc/openmpi-1.8.2/include/',
+ '--with-Rmpi-libpath=/usr/mpi/gcc/openmpi-1.8.2/lib64/',
+ ...
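For context, a complete call of this shape usually also passes the Rmpi type flag; a hedged reconstruction (the --with-Rmpi-type argument is an assumption here, and the paths must match the local OpenMPI installation):

    install.packages('Rmpi',
                     repos = 'http://cran.r-project.org',
                     configure.args = c(
                       '--with-Rmpi-include=/usr/mpi/gcc/openmpi-1.8.2/include/',
                       '--with-Rmpi-libpath=/usr/mpi/gcc/openmpi-1.8.2/lib64/',
                       '--with-Rmpi-type=OPENMPI'))  # assumed: usual companion flag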