Dear PETSc dev team,
   If I want PETSc to use my own parallel partitioning (instead of
PETSc's default "even" partitioning of rows), I assume I can provide
the local rows/columns as follows.

MatCreateShell(MPI_Comm comm, PetscInt m, PetscInt n, PetscInt M, PetscInt N, void *ctx, Mat *A)
<https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateShell.html#MatCreateShell>

  If that's the case, what role do the local columns (n) play? Memory
allocation? Should I set n to the global number of columns, the local
number of columns, or PETSC_DECIDE in this case?
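
For concreteness, here is a minimal sketch of the call I have in mind
(the global sizes and the two-rank 60/40 split below are placeholder
assumptions on my part, not code I am actually running):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscMPIInt    rank;
  PetscInt       m, n;
  PetscInt       M = 100, N = 100;  /* global sizes (made-up values) */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);

  /* my own uneven partition, assuming exactly 2 ranks:
     rank 0 owns 60 rows, rank 1 owns 40 */
  m = (rank == 0) ? 60 : 40;
  n = m;  /* or the global N, or PETSC_DECIDE? -- this is my question */

  ierr = MatCreateShell(PETSC_COMM_WORLD, m, n, M, N, NULL, &A); CHKERRQ(ierr);
  /* ... MatShellSetOperation(A, MATOP_MULT, ...) etc. would go here ... */

  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}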

Thanks,
Sam
