Lisandro,

Good catch. Indeed, MPI_Ireduce_scatter was not covering the case where
MPI_IN_PLACE is used over a communicator with a single participant. I
pushed a patch and scheduled it for 1.8.4. Check
https://svn.open-mpi.org/trac/ompi/ticket/4924 for more info.
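For reference, a test along these lines exercises that case (this is only a
sketch, not your exact ireduce_scatter.c, so the buffer sizes and overall
structure are guesses; NBCOLL picks the nonblocking path, as in the compile
commands below):

  #include <mpi.h>
  #include <stdio.h>

  /* Sketch of an in-place (I)reduce-scatter test: every rank contributes
     'size' ones and receives one element, so rbuf[0] should equal 'size'.
     Assumes at most 64 ranks for simplicity. */
  int main(int argc, char *argv[])
  {
      int rank, size, i;
      int recvbuf[64];
      int recvcounts[64];

      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);

      for (i = 0; i < size; i++) {
          recvbuf[i] = 1;       /* with MPI_IN_PLACE the input lives in recvbuf */
          recvcounts[i] = 1;
      }

  #if NBCOLL
      MPI_Request req;
      MPI_Ireduce_scatter(MPI_IN_PLACE, recvbuf, recvcounts,
                          MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
      MPI_Wait(&req, MPI_STATUS_IGNORE);
  #else
      MPI_Reduce_scatter(MPI_IN_PLACE, recvbuf, recvcounts,
                         MPI_INT, MPI_SUM, MPI_COMM_WORLD);
  #endif

      printf("[%d] rbuf[%d]=%2d  expected:%2d\n", rank, 0, recvbuf[0], size);

      MPI_Finalize();
      return 0;
  }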

Thanks,
  George.


On Sun, Sep 28, 2014 at 6:29 AM, Lisandro Dalcin <dalc...@gmail.com> wrote:

> On 22 April 2014 03:02, George Bosilca <bosi...@icl.utk.edu> wrote:
> > Btw, the proposed validator was incorrect: the first printf, instead of
> >
> >  printf("[%d] rbuf[%d]=%2d  expected:%2d\n", rank, 0, recvbuf[i], size);
> >
> > should be
> >
> >  printf("[%d] rbuf[%d]=%2d  expected:%2d\n", rank, 0, recvbuf[0], size);
> >
>
> I'm testing this with 1.8.3 after fixing my incorrect printf, and I
> still get different results (and the NBCOLL=1 one is wrong) when using one
> process (for two or more, everything is OK).
>
> $ mpicc -DNBCOLL=0 ireduce_scatter.c && mpiexec -n 1 ./a.out
> [0] rbuf[0]= 1  expected: 1
>
> $ mpicc -DNBCOLL=1 ireduce_scatter.c && mpiexec -n 1 ./a.out
> [0] rbuf[0]=60  expected: 1
>
>
> --
> Lisandro Dalcin
> ============
> Research Scientist
> Computer, Electrical and Mathematical Sciences & Engineering (CEMSE)
> Numerical Porous Media Center (NumPor)
> King Abdullah University of Science and Technology (KAUST)
> http://numpor.kaust.edu.sa/
>
> 4700 King Abdullah University of Science and Technology
> al-Khawarizmi Bldg (Bldg 1), Office # 4332
> Thuwal 23955-6900, Kingdom of Saudi Arabia
> http://www.kaust.edu.sa
>
> Office Phone: +966 12 808-0459
>
