Re: [petsc-users] Strange mpi timing and CPU load when -np > 2

2022-09-26 Thread Barry Smith
It is important to check out https://petsc.org/main/faq/?highlight=faq#why-is-my-parallel-solver-slower-than-my-sequential-solver-or-i-have-poor-speed-up In
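The linked FAQ entry attributes poor parallel speed-up mostly to saturated memory bandwidth, and suggests measuring it. A minimal sketch of that check, assuming a configured PETSc source tree with `PETSC_DIR`/`PETSC_ARCH` set (the `NPMAX` value here is an illustrative choice, not from the thread):

```shell
# Run the STREAMS memory-bandwidth benchmark that the PETSc FAQ recommends;
# it reports how bandwidth scales as more MPI ranks are added.
cd "$PETSC_DIR"
make streams NPMAX=4
```

If the reported bandwidth stops growing after 2 ranks, the flat or worse timings at `-np > 2` are expected on that hardware.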

Re: [petsc-users] Strange mpi timing and CPU load when -np > 2

2022-09-26 Thread Matthew Knepley
On Mon, Sep 26, 2022 at 12:40 PM Duan Junming via petsc-users <petsc-users@mcs.anl.gov> wrote:
> Dear all,
>
> I am using PETSc 3.17.4 on a Linux server, compiling
> with --download-exodus --download-hdf5 --download-openmpi
> --download-triangle --with-fc=0 --with-debugging=0

[petsc-users] Strange mpi timing and CPU load when -np > 2

2022-09-26 Thread Duan Junming via petsc-users
Dear all,

I am using PETSc 3.17.4 on a Linux server, compiling with --download-exodus --download-hdf5 --download-openmpi --download-triangle --with-fc=0 --with-debugging=0 PETSC_ARCH=arch-linux-c-opt COPTFLAGS="-g -O3" CXXOPTFLAGS="-g -O3". The strange thing is when I run my code with mpirun
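The preview cuts off at the mpirun command. For timing and CPU-load oddities at `-np > 2`, a common first check with the OpenMPI built by --download-openmpi is to pin ranks and profile the run; a hedged sketch (`./my_app` and its arguments are placeholders, not from the thread):

```shell
# Bind each MPI rank to its own core and print the bindings, so stray
# oversubscription or rank migration shows up immediately; -log_view asks
# PETSc to print a performance summary at the end of the run.
mpirun -n 4 --bind-to core --report-bindings ./my_app -log_view
```

Comparing the `-log_view` summaries for -n 1, 2, and 4 shows whether the extra time is spent in computation (likely memory bandwidth) or in MPI communication.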