I believe I made the change discussed above to my local version of FiPy,
and I no longer get the error when calling the globalValue attribute of a
mesh variable.

I now notice that .globalValue on a CellVariable works as I would expect,
while for a FaceVariable, .globalValue still doesn't give me the whole
domain: its shape matches the local .value shape, as the output below
shows. (A sketch of a possible manual workaround follows the output.)

The attached script prints the .shape of .value and .globalValue on each
processor (see output below).

Is something wrong, or is there a different/better way to do this?

Thanks,

Kris

---

> mpirun -n 1 python test.py
hello from 0 out of 1
0 cellVariable (400,) (400,)

0 faceVariable (840,) (840,)

0 center coordinates (2, 400) (2, 400)

0 face coordinates (2, 840) (2, 840)

> mpirun -n 2 python test.py
hello from 0 out of 2
hello from 1 out of 2
0 cellVariable (240,) (400,)

1 cellVariable (240,) (400,)

0 faceVariable (512,) (512,)

1 faceVariable (512,) (512,)

1 center coordinates (2, 240) (2, 400)

0 center coordinates (2, 240) (2, 400)

0 face coordinates (2, 512) (2, 512)
1 face coordinates (2, 512) (2, 512)
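
In case it's useful, here's the manual gather I've been sketching as a
workaround. It mirrors what meshVariable._getGlobalValue does for cells
(see the traceback in the quoted thread below), but with face IDs; the
_globalNonOverlappingFaceIDs and _localNonOverlappingFaceIDs attribute
names are my guesses from the cell-centered analogs, so they may not
exist in every mesh:

    import fipy as fp
    from fipy.tools import numerix

    m = fp.Grid2D(nx=20, ny=20)
    k = fp.FaceVariable(m, value=2.0)
    comm = m.communicator

    # face values owned by this rank (attribute names assumed, not verified)
    ids = m._globalNonOverlappingFaceIDs
    vals = numerix.take(k.value, m._localNonOverlappingFaceIDs)

    # gather every rank's piece, then sort into global-ID order
    allIDs = numerix.concatenate(comm.allgather(ids))
    allVals = numerix.concatenate(comm.allgather(vals))
    globalFaceValues = allVals[numerix.argsort(allIDs)]

    print fp.tools.parallel.procID, 'gathered faceVariable', globalFaceValues.shape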

On Fri, Apr 29, 2016 at 12:26 PM, Guyer, Jonathan E. Dr. (Fed) <[email protected]> wrote:

> Absolutely
>
> > > On Apr 29, 2016, at 11:42 AM, Kris Kuhlman <[email protected]> wrote:
> >
> > Thanks for figuring this out. Will the patched FiPy version be available
> > from the GitHub repository?
> >
> > Kris
> >
> > On Wed, Apr 27, 2016 at 3:17 PM, Keller, Trevor (Fed) <[email protected]> wrote:
> > Looking into the rest of the FiPy source, we're already calling
> > allgather(sendobj) in several places, and rarely calling
> > allgather(sendobj, recvobj). To preserve the existing function calls
> > (all of which are lower-case) and to disturb the code as little as
> > possible, removing the recvobj argument appears to be the right call
> > after all.
> >
> > Working on the PR.
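> >
> > For concreteness, a minimal sketch of what I have in mind for the
> > allgather method in fipy/tools/comms/mpi4pyCommWrapper.py (a sketch
> > of the fix, not necessarily the final PR):
> >
> >     def allgather(self, sendobj):
> >         # newer mpi4py dropped the 'recvobj' keyword from the
> >         # lower-case (pickle-based) collectives; the gathered
> >         # list is simply returned instead
> >         return self.mpi4py_comm.allgather(sendobj)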
> >
> > Trevor
> >
> > ________________________________________
> > From: [email protected] <[email protected]> on behalf of Guyer, Jonathan E. Dr. (Fed) <[email protected]>
> > Sent: Wednesday, April 27, 2016 4:39:05 PM
> > To: FIPY
> > Subject: Re: globalValue in parallel
> >
> > It sounds like you're volunteering to put together the pull request
> > with appropriate tests.
> >
> > > On Apr 27, 2016, at 4:06 PM, Keller, Trevor (Fed) <[email protected]> wrote:
> > >
> > > The mpi4py commit mentions that the receive object is no longer
> > > needed for the lower-case form of the commands. Browsing the full
> > > source shows that the upper-case commands retain both the send and
> > > receive objects. To avoid deviating too far from the MPI standard,
> > > I'd like to suggest changing the case (Allgather instead of
> > > allgather), rather than dropping buffers, in our
> > > mpi4pyCommWrapper.py.
> > >
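> > > To illustrate the distinction in plain mpi4py (not FiPy code): the
> > > lower-case collectives pickle arbitrary Python objects and return
> > > the result, while the upper-case ones fill a caller-supplied buffer:
> > >
> > >     from mpi4py import MPI
> > >     import numpy
> > >
> > >     comm = MPI.COMM_WORLD
> > >
> > >     # lower-case: send any picklable object, receive a list
> > >     pieces = comm.allgather(comm.rank)
> > >
> > >     # upper-case: preallocate the receive buffer yourself
> > >     send = numpy.array([float(comm.rank)])
> > >     recv = numpy.empty(comm.size, dtype='d')
> > >     comm.Allgather([send, MPI.DOUBLE], [recv, MPI.DOUBLE])
> > >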
> > > Trevor
> > >
> > >
> > > ________________________________________
> > > From: [email protected] <[email protected]> on behalf of Guyer, Jonathan E. Dr. (Fed) <[email protected]>
> > > Sent: Wednesday, April 27, 2016 3:53:39 PM
> > > To: FIPY
> > > Subject: Re: globalValue in parallel
> > >
> > > It looks like 'recvobj' was removed from mpi4py about two years ago:
> > >
> > >
> > > https://bitbucket.org/mpi4py/mpi4py/commits/3d8503a11d320dd1c3030ec0dbce95f63b0ba602
> > >
> > > but I'm not sure when it made it into the released version.
> > >
> > >
> > > It looks like you can safely edit
> > > fipy/tools/comms/mpi4pyCommWrapper.py to remove the 'recvobj' argument.
> > >
> > >
> > > We'll do some tests and push a fix as soon as possible. Thanks for
> > > alerting us to the issue.
> > >
> > > Filed as https://github.com/usnistgov/fipy/issues/491
> > >
> > >
> > >> On Apr 27, 2016, at 2:23 PM, Kris Kuhlman <[email protected]> wrote:
> > >>
> > >> I built the Trilinos-capable version of FiPy. It seems to work in
> > >> serial (even for a non-trivial case), but I am getting errors with
> > >> more than one processor from a simple call to globalValue, which I
> > >> was trying to use to make a plot by gathering the results to
> > >> procID == 0.
> > >>
> > >> I used the latest git versions of mpi4py and Trilinos. Am I doing
> > >> something wrong (is there a different, preferred way to gather
> > >> things to a single processor to save or make plots?), or do I need
> > >> to use specific versions of these packages and rebuild? It seems
> > >> the function is expecting something with a different interface or
> > >> call signature.
> > >>
> > >> Kris
> > >>
> > >> python test.py
> > >> hello from 0 out of 1 [ 1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.]
> > >>
> > >> `--> ~/local/trilinos-fipy/anaconda/bin/mpirun -np 1 python test.py
> > >> hello from 0 out of 1 [ 1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
> > >>  1.  1.]
> > >>
> > >> --> ~/local/trilinos-fipy/anaconda/bin/mpirun -np 2 python test.py
> > >> hello from 1 out of 2
> > >> Traceback (most recent call last):
> > >>   File "test.py", line 6, in <module>
> > >>     print 'hello from',fp.tools.parallel.procID,'out of',fp.tools.parallel.Nproc,p.globalValue
> > >>   File "/home/klkuhlm/local/trilinos-fipy/anaconda/lib/python2.7/site-packages/fipy/variables/cellVariable.py", line 163, in globalValue
> > >>     self.mesh._globalNonOverlappingCellIDs)
> > >>   File "/home/klkuhlm/local/trilinos-fipy/anaconda/lib/python2.7/site-packages/fipy/variables/meshVariable.py", line 171, in _getGlobalValue
> > >>     globalIDs = numerix.concatenate(self.mesh.communicator.allgather(globalIDs))
> > >>   File "/home/klkuhlm/local/trilinos-fipy/anaconda/lib/python2.7/site-packages/fipy/tools/comms/mpi4pyCommWrapper.py", line 75, in allgather
> > >>     return self.mpi4py_comm.allgather(sendobj=sendobj, recvobj=recvobj)
> > >>   File "MPI/Comm.pyx", line 1288, in mpi4py.MPI.Comm.allgather (src/mpi4py.MPI.c:109141)
> > >> TypeError: allgather() got an unexpected keyword argument 'recvobj'
> > >> hello from 0 out of 2
> > >> Traceback (most recent call last):
> > >>   File "test.py", line 6, in <module>
> > >>     print 'hello from',fp.tools.parallel.procID,'out of',fp.tools.parallel.Nproc,p.globalValue
> > >>   File "/home/klkuhlm/local/trilinos-fipy/anaconda/lib/python2.7/site-packages/fipy/variables/cellVariable.py", line 163, in globalValue
> > >>     self.mesh._globalNonOverlappingCellIDs)
> > >>   File "/home/klkuhlm/local/trilinos-fipy/anaconda/lib/python2.7/site-packages/fipy/variables/meshVariable.py", line 171, in _getGlobalValue
> > >>     globalIDs = numerix.concatenate(self.mesh.communicator.allgather(globalIDs))
> > >>   File "/home/klkuhlm/local/trilinos-fipy/anaconda/lib/python2.7/site-packages/fipy/tools/comms/mpi4pyCommWrapper.py", line 75, in allgather
> > >>     return self.mpi4py_comm.allgather(sendobj=sendobj, recvobj=recvobj)
> > >>   File "MPI/Comm.pyx", line 1288, in mpi4py.MPI.Comm.allgather (src/mpi4py.MPI.c:109141)
> > >> TypeError: allgather() got an unexpected keyword argument 'recvobj'
> > >> -------------------------------------------------------
> > >> Primary job  terminated normally, but 1 process returned
> > >> a non-zero exit code.. Per user-direction, the job has been aborted.
> > >> -------------------------------------------------------
> > >>
> > >> --------------------------------------------------------------------------
> > >> mpirun detected that one or more processes exited with non-zero status,
> > >> thus causing the job to be terminated. The first process to do so was:
> > >>
> > >>   Process name: [[1719,1],1]
> > >>   Exit code:    1
> > >> --------------------------------------------------------------------------
> > >>
> > >> <test.py>
import fipy as fp

# 20x20 structured grid: 400 cells and 840 faces globally;
# in parallel each rank holds a slab of cells plus overlap
m = fp.Grid2D(nx=20, ny=20)
p = fp.CellVariable(m, value=1.0)
k = fp.FaceVariable(m, value=2.0)

ID = fp.tools.parallel.procID

print 'hello from', ID, 'out of', fp.tools.parallel.Nproc

# compare the local (overlapping) shape with the gathered global shape
print ID, 'cellVariable', p.value.shape, p.globalValue.shape
print
print ID, 'faceVariable', k.value.shape, k.globalValue.shape
print
print ID, 'center coordinates', m.cellCenters.value.shape, m.cellCenters.globalValue.shape
print
print ID, 'face coordinates', m.faceCenters.value.shape, m.faceCenters.globalValue.shape

