Lisandro,

I fixed this in master and made a PR for v1.8.

It is a one-liner; you can find it at
https://github.com/ggouaillardet/ompi-release/commit/0e478c1191715fff37e4deb56f8891774db62775

Cheers,

Gilles

On 2014/12/23 23:43, Lisandro Dalcin wrote:
> On 28 September 2014 at 19:13, George Bosilca <bosi...@icl.utk.edu> wrote:
>> Lisandro,
>>
>> Good catch. Indeed, MPI_Ireduce_scatter was not covering the case where
>> MPI_IN_PLACE was used over a communicator with a single participant. I
>> pushed a patch and scheduled it for 1.8.4. Check
>> https://svn.open-mpi.org/trac/ompi/ticket/4924 for more info.
>>
> While your change fixed the issues when using MPI_IN_PLACE, 1.8.4 now
> seems to fail when MPI_IN_PLACE is not used.
>
> Please try the attached example:
>
> $ mpicc -DNBCOLL=0 ireduce_scatter.c
> $ mpiexec -n 2 ./a.out
> [0] rbuf[0]= 2  expected: 2
> [0] rbuf[1]= 0  expected: 0
> [1] rbuf[0]= 2  expected: 2
> [1] rbuf[1]= 0  expected: 0
> $ mpiexec -n 1 ./a.out
> [0] rbuf[0]= 1  expected: 1
>
>
> $ mpicc -DNBCOLL=1 ireduce_scatter.c
> $ mpiexec -n 2 ./a.out
> [0] rbuf[0]= 2  expected: 2
> [0] rbuf[1]= 0  expected: 0
> [1] rbuf[0]= 2  expected: 2
> [1] rbuf[1]= 0  expected: 0
> $ mpiexec -n 1 ./a.out
> [0] rbuf[0]= 0  expected: 1
>
> The last one is wrong. Not sure what's going on. Am I missing something?
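
The attached ireduce_scatter.c is not included above; here is a minimal sketch of what such a reproducer might look like. The buffer sizes, recvcounts, and the NBCOLL compile-time switch are assumptions inferred from the output, not the original attachment:

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int i, size, rank;
    int sendbuf[2]    = {1, 1};   /* each rank contributes 1 per block */
    int recvbuf[2]    = {0, 0};   /* only recvcounts[rank] entries get written */
    int recvcounts[2] = {1, 1};   /* one element scattered back to each rank */

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

#if NBCOLL
    /* nonblocking collective, completed immediately with MPI_Wait */
    MPI_Request request;
    MPI_Ireduce_scatter(sendbuf, recvbuf, recvcounts, MPI_INT,
                        MPI_SUM, MPI_COMM_WORLD, &request);
    MPI_Wait(&request, MPI_STATUS_IGNORE);
#else
    /* blocking collective */
    MPI_Reduce_scatter(sendbuf, recvbuf, recvcounts, MPI_INT,
                       MPI_SUM, MPI_COMM_WORLD);
#endif

    /* rbuf[0] should hold the sum across ranks (== size); any remaining
       entries should be left untouched (== 0) */
    for (i = 0; i < size; i++)
        printf("[%d] rbuf[%d]= %d  expected: %d\n",
               rank, i, recvbuf[i], i == 0 ? size : 0);

    MPI_Finalize();
    return 0;
}

With a sketch like this, -DNBCOLL=1 run on a single process would leave rbuf[0] at 0 instead of 1, matching the failing case reported above.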
