You should be able to use any MPI communicator with PETSc objects.

PETSC_COMM_WORLD/PETSC_COMM_SELF are just a couple of convenience
communicators that PETSc provides.
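
For illustration, here is a minimal C sketch of the pattern described
below (it uses MPI_Comm_split instead of explicit MPI_Group operations
for brevity, and the local vector size of 1 is just an example):

#include <petscvec.h>

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank,size,color;
  MPI_Comm       subcomm;
  Vec            v;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);

  /* split PETSC_COMM_WORLD into two disjoint sub-communicators */
  color = (rank < size/2) ? 0 : 1;
  ierr = MPI_Comm_split(PETSC_COMM_WORLD,color,rank,&subcomm);CHKERRQ(ierr);

  /* create a parallel vector on the sub-communicator and set every
     local entry to this process's rank on PETSC_COMM_WORLD */
  ierr = VecCreateMPI(subcomm,1,PETSC_DECIDE,&v);CHKERRQ(ierr);
  ierr = VecSet(v,(PetscScalar)rank);CHKERRQ(ierr);
  ierr = VecView(v,PETSC_VIEWER_STDOUT_(subcomm));CHKERRQ(ierr);

  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = MPI_Comm_free(&subcomm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run with, e.g., mpiexec -n 4 ./ex and each half of PETSC_COMM_WORLD
views its own vector. PETSc duplicates (internally, via attributes) any
communicator it is handed, so a user-created sub-communicator is
treated the same way as PETSC_COMM_WORLD.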

Satish

On Tue, 31 Oct 2017, Neelam Patel wrote:

> Hello PETSc users,
> Working in Fortran, I created 2 disjoint communicators with MPI_Group 
> operations using PETSC_COMM_WORLD as the "base" comm. I created parallel 
> vectors on each communicator, and set values in them equal to their ranks on 
> PETSC_COMM_WORLD. Everything seems to be working fine. My question is: Is 
> this a valid thing to do? Or will it cause some undefined behaviour that I'm 
> not seeing right now?
> I somehow thought that communicators created this way were not 'visible' to 
> PETSc. Or is it that using PETSC_COMM_WORLD above eliminates that problem?
> Thank you in advance,
> Neelam
> 
