I wrote this small piece of code to demonstrate the basics. If you believe
(as I do) that it is worth adding to the available examples, feel free
to do so!
Franck
----- Original Message -----
> From: "Franck Houssen" <[email protected]>
> To: "Matthew Knepley" <[email protected]>
> Cc: "PETSc" <[email protected]>, "PETSc" <[email protected]>
> Sent: Tuesday, May 23, 2017 18:28:03
> Subject: Re: [petsc-dev] Using PETSc MatIS, how to matmult a global IS matrix
> and a global vector ?
> OK, thanks. This is helpful. But I really think the doc should be more
> verbose about this: it is really confusing, and I didn't find any simple
> example to begin with, which makes all this even more confusing (personal
> opinion).
> Franck
> ----- Original Message -----
> > From: "Matthew Knepley" <[email protected]>
> > To: "Franck Houssen" <[email protected]>
> > Cc: "Stefano Zampini" <[email protected]>, "PETSc"
> > <[email protected]>, "PETSc" <[email protected]>
> > Sent: Tuesday, May 23, 2017 13:21:21
> > Subject: Re: [petsc-dev] Using PETSc MatIS, how to matmult a global IS matrix
> > and a global vector ?
> > On Tue, May 23, 2017 at 4:53 AM, Franck Houssen <[email protected]>
> > wrote:
> > > The first thing I did was to put 3, not 4: I got an error thrown in
> > > MatCreateIS (see the git diff + stack below). As the error said, I used
> > > globalSize = numberOfMPIProcesses * localSize: my understanding is that,
> > > when using MatIS, the global size needs to be the sum of all local sizes.
> > > Correct?
> > No. MatIS means that the matrix is not assembled. The easiest way (for me)
> > to think of this is that processes do not have to hold full rows. One
> > process can hold part of row i, and another process can hold another part.
> > However, there are still the same number of global rows.
> > > I have a 3x3 global matrix made of two overlapping 2x2 local matrices (=
> > > diagonal with 1.). Each local matrix corresponds to one domain (each
> > > domain is delegated to one MPI proc, so I have 2 MPI procs because I have
> > > 2 domains).
> > So the global size is 3. The local size here is not the size of the local
> > IS block, since that is a property only of MatIS. It is the size of the
> > local piece of the vector you multiply. This allows PETSc to understand
> > the parallel layout of the Vec, and how it matches the Mat.
> > This is somewhat confusing because FEM people mean something different by
> > "local" than we do here, and in fact we use this other definition of local
> > when assembling operators.
> > Matt
> > > This is the simplest possible example: I have two 2x2 (local) diag
> > > matrices that overlap, so that the global matrix built from them is 1, 2,
> > > 1 on the diagonal (local contributions add up in the middle).
> > > I need to MatMult this global matrix with a global vector filled with 1.
> > > Franck
> > > Git diff:
> > > --- a/matISLocalMat.cpp
> > > +++ b/matISLocalMat.cpp
> > > @@ -16,7 +16,7 @@ int main(int argc,char **argv) {
> > >   int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size); if (size != 2) return 1;
> > >   int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> > > - PetscInt localSize = 2, globalSize = localSize*2 /*2 MPI*/;
> > > + PetscInt localSize = 2, globalSize = 3;
> > >   PetscInt localIdx[2] = {0, 0};
> > >   if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
> > >   else {localIdx[0] = 1; localIdx[1] = 2;}
> > > Stack error:
> > > [0]PETSC ERROR: Nonconforming object sizes
> > > [0]PETSC ERROR: Sum of local lengths 4 does not equal global length 3, my local length 2
> > > [0]PETSC ERROR: [0] ISG2LMapApply line 17 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/vec/is/utils/isltog.c
> > > [0]PETSC ERROR: [0] MatSetValues_IS line 692 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> > > [0]PETSC ERROR: [0] MatSetValues line 1157 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> > > [0]PETSC ERROR: [0] MatISSetPreallocation_IS line 95 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> > > [0]PETSC ERROR: [0] MatISSetPreallocation line 80 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> > > [0]PETSC ERROR: [0] PetscSplitOwnership line 80 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/sys/utils/psplit.c
> > > [0]PETSC ERROR: [0] PetscLayoutSetUp line 129 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/vec/is/utils/pmap.c
> > > [0]PETSC ERROR: [0] MatSetLocalToGlobalMapping_IS line 628 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> > > [0]PETSC ERROR: [0] MatSetLocalToGlobalMapping line 1899 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/interface/matrix.c
> > > [0]PETSC ERROR: [0] MatCreateIS line 986 /home/fghoussen/Documents/INRIA/petsc-3.7.6/src/mat/impls/is/matis.c
> > > > From: "Stefano Zampini" <[email protected]>
> > > > To: "Matthew Knepley" <[email protected]>
> > > > Cc: "Franck Houssen" <[email protected]>, "PETSc"
> > > > <[email protected]>, "PETSc" <[email protected]>
> > > > Sent: Sunday, May 21, 2017 23:02:37
> > > > Subject: Re: [petsc-dev] Using PETSc MatIS, how to matmult a global IS
> > > > matrix and a global vector ?
> > > > Franck,
> > > > PETSc takes care of doing the matrix-vector multiplication properly
> > > > using MatIS. As Matt said, the layout of the vectors is the usual
> > > > parallel layout.
> > > > The local sizes of the MatIS matrix (i.e. the local sizes of the left
> > > > and right vectors used in MatMult) are not the sizes of the local
> > > > subdomain matrices in MatIS.
> > > > > On May 21, 2017, at 6:47 PM, Matthew Knepley <[email protected]> wrote:
> > > > > On Sun, May 21, 2017 at 11:26 AM, Franck Houssen
> > > > > <[email protected]> wrote:
> > > > > > Using PETSc MatIS, how to matmult a global IS matrix and a global
> > > > > > vector ? Example is attached: I don't get what I expect, that is, a
> > > > > > vector such that proc0 = [1, 2] and proc1 = [2, 1]
> > > > > 1) I think the global size of your matrix is wrong. You seem to want
> > > > > 3, not 4
> > > > > 2) Global vectors have a non-overlapping row partition. You might be
> > > > > thinking of local vectors
> > > > > Thanks,
> > > > > Matt
> > > > > --
> > > > > What most experimenters take for granted before they begin their
> > > > > experiments is infinitely more interesting than any results to which
> > > > > their experiments lead.
> > > > > -- Norbert Wiener
> > > > > http://www.caam.rice.edu/~mk51/
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which their
> > experiments lead.
> > -- Norbert Wiener
> > http://www.caam.rice.edu/~mk51/
// Using PETSc MatIS, how to matmult a global IS matrix and a global vector ?
//
// A 3x3 global matrix is built (diag: 1, 2, 1): it's made of 2 overlapping 2x2 local matrices (diag: 1, 1).
//
// How to multiply this matrix with a global vector (filled with 1.) ?
//
// ~> g++ -o matISProdMatVec.exe matISProdMatVec.cpp -lpetsc -lm; mpirun -n 2 matISProdMatVec.exe
// ~> mpirun -n 2 matISProdMatVec.exe
#include <iostream>
#include "petsc.h"
int main(int argc,char **argv) {
PetscInitialize(&argc, &argv, NULL, NULL);
int size = 0; MPI_Comm_size(MPI_COMM_WORLD, &size); if (size != 2) {PetscFinalize(); return 1;}
int rank = 0; MPI_Comm_rank(MPI_COMM_WORLD, &rank);
PetscInt localSize = 2, globalSize = 3;
PetscInt localIdx[2] = {0, 0};
if (rank == 0) {localIdx[0] = 0; localIdx[1] = 1;}
else {localIdx[0] = 1; localIdx[1] = 2;}
ISLocalToGlobalMapping map;
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, localSize, localIdx, PETSC_COPY_VALUES, &map);
Mat A;
// MatIS means that the matrix is not assembled. The easiest way to think of this is that processes do not have
// to hold full rows. One process can hold part of row i, and another processes can hold another part. However, there are still
// the same number of global rows. The local size here is not the size of the local IS block, since that is a property only of MatIS. It is the
// size of the local piece of the vector you multiply. This allows PETSc to understand the parallel layout of the Vec, and how it
// matched the Mat.
// Conclusion: it's crucial to use PETSC_DECIDE in place of localSize in your call to MatCreateIS.
MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, globalSize, globalSize, map, map, &A);
ISLocalToGlobalMappingDestroy(&map);
MatISSetPreallocation(A, localSize, NULL, localSize, NULL);
PetscScalar localVal[4] = {1., 0., 0., 1.};
MatSetValues(A, 2, localIdx, 2, localIdx, localVal, ADD_VALUES);
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
if (rank == 0) std::cout << std::endl << "matrix A:" << std::endl << std::endl;
MatView(A, PETSC_VIEWER_STDOUT_WORLD); PetscViewerFlush(PETSC_VIEWER_STDOUT_WORLD); // Diag: 1, 2, 1
Vec x, y;
MatCreateVecs(A, &x, &y);
VecSet(x, 1.);
if (rank == 0) std::cout << std::endl << "vector x:" << std::endl << std::endl;
VecView(x, PETSC_VIEWER_STDOUT_WORLD);
MatMult(A, x, y);
if (rank == 0) std::cout << std::endl << "vector y=Ax:" << std::endl << std::endl;
VecView(y, PETSC_VIEWER_STDOUT_WORLD);
VecDestroy(&x); VecDestroy(&y); MatDestroy(&A); // Free PETSc objects before finalizing.
PetscFinalize();
return 0;
}