> On Oct 31, 2017, at 6:00 PM, Neelam Patel <[email protected]> wrote:
>
> Hello PETSc users,
>
> Working in Fortran, I created 2 disjoint communicators with MPI_Group
> operations using PETSC_COMM_WORLD as the "base" comm. I created parallel
> vectors on each communicator, and set values in them equal to their ranks on
> PETSC_COMM_WORLD. Everything seems to be working fine. My question is: Is
> this a valid thing to do? Or will it cause some undefined behaviour that I'm
> not seeing right now?
>
> I somehow thought that communicators created this way were not 'visible' to
> PETSc. Or is it that using PETSC_COMM_WORLD above eliminates that problem?
   As you know, each PETSc object created with a communicator "lives" on that
communicator, so PETSc certainly "knows" about the communicators you pass in
to create objects and works with them. What you did seems completely
reasonable and should work. I am not sure what your question is?

   Barry

>
> Thank you in advance,
> Neelam
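
[For illustration, a minimal sketch of the pattern being discussed. It is in C
(the PETSc Fortran interface has the same calls), and it uses MPI_Comm_split
rather than the MPI_Group route the poster described; the colors, sizes, and
error-checking style are just assumptions for the example, not the poster's
actual code.]

/* Sketch: split PETSC_COMM_WORLD into two disjoint communicators,
   create a parallel Vec on each, and store each process's world rank. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank, size, color;
  MPI_Comm       subcomm;
  Vec            x;
  PetscInt       istart, iend;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);

  /* lower half of the world ranks gets color 0, upper half gets color 1,
     producing two disjoint subcommunicators */
  color = (rank < size/2) ? 0 : 1;
  ierr = MPI_Comm_split(PETSC_COMM_WORLD, color, rank, &subcomm);CHKERRQ(ierr);

  /* a parallel vector that lives only on the subcommunicator */
  ierr = VecCreate(subcomm, &x);CHKERRQ(ierr);
  ierr = VecSetSizes(x, 1, PETSC_DECIDE);CHKERRQ(ierr); /* one entry per rank */
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);

  /* put this process's PETSC_COMM_WORLD rank into its single local entry */
  ierr = VecGetOwnershipRange(x, &istart, &iend);CHKERRQ(ierr);
  ierr = VecSetValue(x, istart, (PetscScalar)rank, INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(x);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(x);CHKERRQ(ierr);

  ierr = VecView(x, PETSC_VIEWER_STDOUT_(subcomm));CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = MPI_Comm_free(&subcomm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

[Each subcommunicator's VecView prints only the entries owned by its half of
PETSC_COMM_WORLD, which is one way to check that the two vectors are
independent objects on independent communicators.]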
