On Wed 22. Jan 2020 at 16:12, Felix Huber <[email protected]> wrote:
> Hello,
>
> I am currently investigating why our code does not show the expected weak
> scaling behaviour in a CG solver.

Can you please send representative log files which characterize the lack of
scaling (include the full log_view)? Are you using a KSP/PC configuration
which should weak scale?

Thanks,
Dave

> Therefore I wanted to try out different communication methods for the
> VecScatter in the matrix-vector product. However, it seems like PETSc
> (version 3.7.6) always chooses either MPI_Alltoallv or MPI_Alltoallw when I
> pass different options via the PETSC_OPTIONS environment variable. Does
> anybody know why this doesn't work as I expected?
>
> The matrix is an MPIAIJ matrix created by a finite element discretization
> of a 3D Laplacian, so it only communicates with 'neighboring' MPI ranks.
> Not sure if it helps, but the code is run on a Cray XC40.
>
> I tried the `ssend`, `rsend`, `sendfirst`, `reproduce` and no options from
> https://www.mcs.anl.gov/petsc/petsc-3.7/docs/manualpages/Vec/VecScatterCreate.html
> which all result in an MPI_Alltoallv. When combined with `nopack`, the
> communication uses MPI_Alltoallw.
>
> Best regards,
> Felix
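For reference, here is a minimal sketch of a standalone scatter test
(assuming PETSc 3.7-era APIs; the name scatter_test.c, the local block size
n = 8, and the shifted-block index sets are purely illustrative, not anything
from the message above) that builds a small MPI-to-MPI VecScatter and views
it, so you can check which communication implementation a given set of
-vecscatter_* options actually selected:

  /* scatter_test.c: build a toy MPI-to-MPI VecScatter and view it, so the
     chosen implementation (and the effect of any -vecscatter_* options) is
     visible. Example run (launcher name is an assumption):
       PETSC_OPTIONS="-vecscatter_ssend -log_view" mpiexec -n 4 ./scatter_test */
  #include <petscvec.h>

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;
    Vec            x, y;
    VecScatter     sct;
    IS             isfrom, isto;
    PetscInt       n = 8, N, rstart, rend;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = VecCreateMPI(PETSC_COMM_WORLD, n, PETSC_DECIDE, &x);CHKERRQ(ierr);
    ierr = VecDuplicate(x, &y);CHKERRQ(ierr);
    ierr = VecGetSize(x, &N);CHKERRQ(ierr);
    ierr = VecGetOwnershipRange(x, &rstart, &rend);CHKERRQ(ierr);
    /* send each rank's local block to the next rank (wrap-around), which
       forces off-process communication in the scatter */
    ierr = ISCreateStride(PETSC_COMM_WORLD, n, rstart, 1, &isfrom);CHKERRQ(ierr);
    ierr = ISCreateStride(PETSC_COMM_WORLD, n, (rstart + n) % N, 1, &isto);CHKERRQ(ierr);
    ierr = VecScatterCreate(x, isfrom, y, isto, &sct);CHKERRQ(ierr);
    ierr = VecScatterBegin(sct, x, y, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecScatterEnd(sct, x, y, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
    /* print how the scatter was set up / which implementation was chosen */
    ierr = VecScatterView(sct, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
    ierr = VecScatterDestroy(&sct);CHKERRQ(ierr);
    ierr = ISDestroy(&isfrom);CHKERRQ(ierr);
    ierr = ISDestroy(&isto);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&y);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Running it with PETSC_OPTIONS set as in the comment (or with the flags on the
command line) and comparing the VecScatterView output across option choices
should at least show whether the options are being picked up at all.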
