Re: [petsc-users] Communication in parallel MatMatMult

2021-12-01 Thread Matthew Knepley
On Wed, Dec 1, 2021 at 9:32 AM Barry Smith wrote:
>
> PETSc uses Elemental to perform such operations.
>
> PetscErrorCode MatMatMultNumeric_Elemental(Mat A,Mat B,Mat C)
> {
>   Mat_Elemental *a = (Mat_Elemental*)A->data;
>   Mat_Elemental *b = (Mat_Elemental*)B->data;
>   Mat_Elemental ...

Re: [petsc-users] Communication in parallel MatMatMult

2021-12-01 Thread Barry Smith
PETSc uses Elemental to perform such operations.

PetscErrorCode MatMatMultNumeric_Elemental(Mat A,Mat B,Mat C)
{
  Mat_Elemental   *a = (Mat_Elemental*)A->data;
  Mat_Elemental   *b = (Mat_Elemental*)B->data;
  Mat_Elemental   *c = (Mat_Elemental*)C->data;
  PetscElemScalar  one = 1, zero ...
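[Editor's note: the archive preview above is cut off. A sketch of the full routine, with the body past the preview (the El::Gemm call and the assembled flag) reconstructed from PETSc's Elemental interface (src/mat/impls/elemental/matelem.cxx); treat it as an approximation rather than a verbatim quote of the mail.]

/* Sketch of the Elemental-backed dense matrix-matrix product in PETSc.
   Everything after the PetscElemScalar declaration is reconstructed and
   is an assumption based on src/mat/impls/elemental/matelem.cxx. */
static PetscErrorCode MatMatMultNumeric_Elemental(Mat A, Mat B, Mat C)
{
  Mat_Elemental   *a   = (Mat_Elemental*)A->data;
  Mat_Elemental   *b   = (Mat_Elemental*)B->data;
  Mat_Elemental   *c   = (Mat_Elemental*)C->data;
  PetscElemScalar  one = 1, zero = 0;

  PetscFunctionBegin;
  /* All interprocess communication is handled inside Elemental's
     distributed GEMM; PETSc itself issues no MPI calls here. */
  El::Gemm(El::NORMAL, El::NORMAL, one, *a->emat, *b->emat, zero, *c->emat);
  C->assembled = PETSC_TRUE;
  PetscFunctionReturn(0);
}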

[petsc-users] Communication in parallel MatMatMult

2021-12-01 Thread Hannes Brandt
Hello,

I am interested in the communication scheme PETSc uses for the multiplication of dense, parallel distributed matrices in MatMatMult. Is it based on collective communication or on single calls to MPI_Send/Recv, and is it done in a blocking or a non-blocking way? How do you make sure ...
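[Editor's note: not part of the thread, but for readers looking for the user-level entry point the question refers to, a minimal sketch of calling MatMatMult on parallel dense matrices. The global size n, the MATDENSE starting type, and the suggestion to select the Elemental backend at run time with -mat_type elemental are illustrative assumptions, not quoted from the discussion.]

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A, B, C;
  PetscInt       n = 100;   /* illustrative global size */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Parallel dense matrices; MatSetFromOptions lets -mat_type elemental
     (with a PETSc build configured with Elemental) override MATDENSE. */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetType(A, MATDENSE);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &B);CHKERRQ(ierr);
  ierr = MatSetSizes(B, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetType(B, MATDENSE);CHKERRQ(ierr);
  ierr = MatSetFromOptions(B);CHKERRQ(ierr);
  ierr = MatSetUp(B);CHKERRQ(ierr);

  /* ... fill A and B with MatSetValues here ... */
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* C = A * B; the communication pattern is decided by the matrix type's
     implementation (e.g. Elemental's distributed GEMM for MATELEMENTAL). */
  ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  ierr = MatDestroy(&C);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}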