Hello Roy,
I checked that there is no infinite loop.
Both ranks get past this->send (dest_processor_id, sendvec, type1, req, send_tag);
and then hang on this->receive (source_processor_id, recv, type2, recv_tag);

Both processes are sending zero elements. Is this correct?

Could you please suggest a simple MPI test that mimics this situation? Then I can check the MPI implementation directly.
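
For concreteness, here is a minimal sketch of the kind of test I have in mind. It assumes the pattern is: each of two ranks posts a non-blocking zero-length send to the other and then blocks in a receive with MPI_ANY_TAG, roughly mirroring the send/receive calls quoted above. The tag value and buffer names are illustrative only, not taken from libMesh.

#include <mpi.h>
#include <cstdio>

int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);

  int rank = 0, size = 0;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  if (size != 2)
    {
      if (rank == 0)
        std::fprintf(stderr, "Run with exactly 2 ranks.\n");
      MPI_Abort(MPI_COMM_WORLD, 1);
    }

  const int other = 1 - rank;   // the peer rank
  const int tag = 42;           // arbitrary tag, for illustration only
  double dummy = 0.;            // buffer is unused since count == 0

  // Non-blocking send of zero elements to the other rank
  MPI_Request req;
  MPI_Isend(&dummy, 0, MPI_DOUBLE, other, tag, MPI_COMM_WORLD, &req);

  // Blocking receive of zero elements with MPI_ANY_TAG
  MPI_Status status;
  MPI_Recv(&dummy, 0, MPI_DOUBLE, other, MPI_ANY_TAG, MPI_COMM_WORLD, &status);

  MPI_Wait(&req, MPI_STATUS_IGNORE);

  std::printf("rank %d: zero-length exchange completed\n", rank);

  MPI_Finalize();
  return 0;
}

If both ranks print the completion message, the MPI library handles the zero-length exchange correctly and the problem is probably elsewhere; if this hangs under mpirun -np 2, that would point at the MPI installation rather than libMesh.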
Thank you,
Michael.


On 09/10/2017 09:07 PM, Roy Stogner wrote:

On Sat, 9 Sep 2017, Michael Povolotskyi wrote:

(gdb) p  recv_tag
$2 = (const libMesh::Parallel::MessageTag &) @0x2ac9d6d3d980: {
  static invalid_tag = -2147483648, _tagvalue = -1, _comm = 0x0}
(gdb) p  recv_tag
$2 = (const libMesh::Parallel::MessageTag &) @0x2b9331482980: {
 static invalid_tag = -2147483648, _tagvalue = -1, _comm = 0x0}

So, does it look like a logic error, or like a problem with the Open MPI installation?

In Open MPI, MPI_ANY_TAG==-1, so they're both set to the right thing,
not waiting for a message that will never come.

Could you, after attaching the debugger, try stepping forward in both
processes, and make sure they're both hanging, not infinite looping?
---
Roy


