Re: [OMPI users] CUDA supported APIs

2019-08-19 Thread Zhang, Junchao via users
From: users-boun...@lists.open-mpi.org on behalf of Zhang, Junchao via users <users@lists.open-mpi.org> Sent: August 15, 2019, 11:52:56 AM To: Open MPI Users <users@lists.open-mpi.org> Cc: Zhang, Junchao <jczh...@mcs.anl.gov> Subject: Re: [OMPI users] CUDA supported APIs Anoth

Re: [OMPI users] CUDA supported APIs

2019-08-15 Thread Zhang, Junchao via users
Another question: if MPI_Allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm) is CUDA-aware, should recvcounts and displs be in CPU memory or GPU memory? --Junchao Zhang On Thu, Aug 15,
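For reference, CUDA-awareness extends only to the data buffers (sendbuf/recvbuf); recvcounts and displs are dereferenced by the library on the CPU and must live in host memory. A minimal sketch, assuming a CUDA-aware Open MPI build (the function and variable names here are illustrative, not from the thread):

```c
/* Sketch (assumes a CUDA-aware Open MPI build): only the data buffers
 * may point to GPU memory; recvcounts and displs are ordinary host
 * arrays that the MPI library reads on the CPU. */
#include <mpi.h>
#include <cuda_runtime.h>
#include <stdlib.h>

void gather_from_gpu(MPI_Comm comm, int nranks, int mycount,
                     const double *d_send /* device pointer */)
{
    /* Metadata stays in host memory. */
    int *recvcounts = malloc(nranks * sizeof *recvcounts);
    int *displs     = malloc(nranks * sizeof *displs);
    MPI_Allgather(&mycount, 1, MPI_INT, recvcounts, 1, MPI_INT, comm);

    int total = 0;
    for (int i = 0; i < nranks; i++) { displs[i] = total; total += recvcounts[i]; }

    double *d_recv;                 /* the data buffer may live on the GPU */
    cudaMalloc((void **)&d_recv, total * sizeof *d_recv);
    MPI_Allgatherv(d_send, mycount, MPI_DOUBLE,
                   d_recv, recvcounts, displs, MPI_DOUBLE, comm);

    cudaFree(d_recv);
    free(recvcounts);
    free(displs);
}
```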

[OMPI users] CUDA supported APIs

2019-08-15 Thread Zhang, Junchao via users
Hi, is the list of CUDA-aware APIs at https://www.open-mpi.org/faq/?category=runcuda#mpi-apis-cuda up to date? I could not find MPI_Neighbor_xxx or MPI_Reduce_local. Thanks. --Junchao Zhang ___ users mailing list users@lists.open-mpi.org

Re: [OMPI users] OpenMPI 2.1.1 bug on Ubuntu 18.04.2 LTS

2019-08-02 Thread Zhang, Junchao via users
>>> p 2.1.1-8 bionic 500 >>> >>> $ sudo apt-get install libopenmpi-dev=2.1.6 >>> Reading package lists... Done >>> Building dependency tree >>> Reading state information... Done >>> E: Version '2.1.6' for 'libopenmpi-dev' was no

Re: [OMPI users] OpenMPI 2.1.1 bug on Ubuntu 18.04.2 LTS

2019-08-02 Thread Zhang, Junchao via users
> $ sudo apt-get install libopenmpi-dev=2.1.6 >> Reading package lists... Done >> Building dependency tree >> Reading state information... Done >> E: Version '2.1.6' for 'libopenmpi-dev' was not found >> >> --Junchao Zhang >> >> >> On Thu, Aug 1,

Re: [OMPI users] OpenMPI 2.1.1 bug on Ubuntu 18.04.2 LTS

2019-08-01 Thread Zhang, Junchao via users
... Done E: Version '2.1.6' for 'libopenmpi-dev' was not found --Junchao Zhang On Thu, Aug 1, 2019 at 1:15 PM Jeff Squyres (jsquyres) <jsquy...@cisco.com> wrote: Does the bug exist in Open MPI v2.1.6? > On Jul 31, 2019, at 2:19 PM, Zhang, Junchao via users > <users@lists.
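The apt failure above comes from pinning a version string that the configured repositories do not carry. A quick way to see what is actually installable (package name from the thread; "2.1.1-8" is the bionic version string quoted earlier in it):

```shell
# List the versions of libopenmpi-dev that your configured repos provide
apt-cache policy libopenmpi-dev     # or: apt-cache madison libopenmpi-dev

# apt can only pin to a version string that appears in that list,
# e.g. the full Ubuntu version:
sudo apt-get install libopenmpi-dev=2.1.1-8
```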

[OMPI users] OpenMPI 2.1.1 bug on Ubuntu 18.04.2 LTS

2019-07-31 Thread Zhang, Junchao via users
Hello, I hit a bug in the Open MPI 2.1.1 package distributed with the latest Ubuntu 18.04.2 LTS. It occurs on a self-to-self send/recv that uses MPI_ANY_SOURCE for message matching. See the attached test code. You can reproduce it even with a single process. It is a severe bug. Since this Ubuntu is widely
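The attached test code is not preserved in the archive; a hedged reconstruction of the described scenario (self-to-self send/recv matched with MPI_ANY_SOURCE, runnable with a single process) might look like:

```c
/* Hedged reconstruction of the kind of reproducer described above
 * (the original attachment is not in the archive): a rank sends to
 * itself and receives with MPI_ANY_SOURCE. Runs with one process. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int payload = 42, got = -1;
    MPI_Request req;
    /* Post the wildcard receive first so the blocking self-send is safe. */
    MPI_Irecv(&got, 1, MPI_INT, MPI_ANY_SOURCE, /*tag=*/0, MPI_COMM_WORLD, &req);
    MPI_Send(&payload, 1, MPI_INT, rank, /*tag=*/0, MPI_COMM_WORLD);
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    printf("rank %d received %d\n", rank, got);  /* expect 42 on a correct build */
    MPI_Finalize();
    return 0;
}
```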

Re: [OMPI users] How to know how OpenMPI was built?

2019-07-31 Thread Zhang, Junchao via users
I did not find "Configure command line" in Open MPI 2.1.1, but found it in 3.1.4. That is OK, though. Thanks. --Junchao Zhang On Tue, Jul 30, 2019 at 5:10 PM Jeff Squyres (jsquyres) <jsquy...@cisco.com> wrote: Run ompi_info. > On Jul 30, 2019, at 5:57 PM, Zhang

[OMPI users] How to know how OpenMPI was built?

2019-07-30 Thread Zhang, Junchao via users
Hello, on a system with a pre-installed Open MPI, how can I find the configure options that were used to build it (so that I can build from source myself with the same options)? Thanks --Junchao Zhang
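As the reply in this thread notes, ompi_info reports this. A sketch of the relevant commands, assuming ompi_info is on the PATH (and noting the follow-up that 2.1.1 may not record the configure line):

```shell
# Print how the installed Open MPI was configured
ompi_info | grep "Configure command line"

# Full build report: version, compilers, components, configure options
ompi_info --all | less
```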

[OMPI users] Compilation errors with SunOS and Sun CC

2019-07-09 Thread Zhang, Junchao via users
Hello, I compiled Open MPI 4.0.1 and 3.1.4 on "SunOS 5.11 illumos-a22312a201 i86pc i386 i86pc" with "cc: Sun C 5.10 SunOS_i386 2009/06/03" and hit many errors, including "evutil_rand.c", line 68: void function cannot return value cc: acomp failed for evutil_rand.c gmake[5]: *** [Makefile:772:
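The quoted diagnostic points at a `return expr;` statement inside a function declared void: gcc accepts `return f();` when f() itself returns void (with at most a warning), but Sun C rejects it outright, as the C standard requires. A small illustration of the portable rewrite (the function names here are invented for the example):

```c
/* Sun C rejects "return f();" inside a void function even when f()
 * itself returns void (gcc merely warns). The portable form is to
 * call the function as a statement and then return with no value. */
#include <assert.h>

static int counter = 0;

static void bump(void) { counter++; }

static void bump_twice(void)
{
    bump();
    bump();      /* instead of the non-portable "return bump();" */
    return;      /* a bare return is valid in any void function */
}
```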

Re: [OMPI users] Possible bugs in MPI_Neighbor_alltoallv()

2019-06-28 Thread Zhang, Junchao via users
to fix this (and the alltoallw variant as well) Meanwhile, you can manually download and apply the patch at https://github.com/open-mpi/ompi/pull/6782.patch Cheers, Gilles On 6/28/2019 1:10 PM, Zhang, Junchao via users wrote: > Hello, > When I do MPI_Neighbor_alltoallv or MPI_Ineighbor_alltoa

[OMPI users] Possible bugs in MPI_Neighbor_alltoallv()

2019-06-27 Thread Zhang, Junchao via users
Hello, when I call MPI_Neighbor_alltoallv or MPI_Ineighbor_alltoallv, I find that Open MPI returns an error whenever either the outdegree or the indegree is zero. The suspicious code is at pneighbor_alltoallv.c / pineighbor_alltoallv.c 101 } else if ((NULL == sendcounts) || (NULL == sdispls) || 102
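Until a fixed release is available, one workaround consistent with the quoted argument check is to pass valid zero-length dummy arrays instead of NULL on ranks whose in/out degree is zero. A hedged sketch (this wrapper is illustrative, not part of the Open MPI API; it assumes the caller passes NULL only when the corresponding degree is zero):

```c
/* Hedged workaround sketch: some Open MPI versions (before the fix in
 * pull request 6782, linked in the reply above) reject NULL counts and
 * displacement arrays when a rank's indegree or outdegree is zero.
 * Substituting valid zero-length dummies satisfies the argument check. */
#include <mpi.h>

void safe_neighbor_alltoallv(const void *sendbuf, const int *sendcounts,
                             const int *sdispls, MPI_Datatype sendtype,
                             void *recvbuf, const int *recvcounts,
                             const int *rdispls, MPI_Datatype recvtype,
                             MPI_Comm graph_comm)
{
    static const int zero = 0;   /* dummy array for zero-degree ranks */
    MPI_Neighbor_alltoallv(sendbuf,
                           sendcounts ? sendcounts : &zero,
                           sdispls    ? sdispls    : &zero,
                           sendtype,
                           recvbuf,
                           recvcounts ? recvcounts : &zero,
                           rdispls    ? rdispls    : &zero,
                           recvtype, graph_comm);
}
```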