Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-06-15 Thread Michael Becker
of load imbalance. At coarser grids, it gets worse. But I need to confirm that this is what caused the poor scaling and the big VecScatter delays in the experiment.  Thanks. --Junchao Zhang On Tue, Jun 12, 2018 at 12:42 AM, Michael Becker <michael.bec...@physik.uni-giessen.de> wrote:

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-06-11 Thread Michael Becker
Hello, any new insights yet? Michael Am 04.06.2018 um 21:56 schrieb Junchao Zhang: Michael,  I can compile and run your test.  I am now profiling it. Thanks. --Junchao Zhang

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-06-04 Thread Michael Becker
ow). This should give us a better idea if your large VecScatter costs are from slow communication or if it is catching some sort of load imbalance. --Junchao Zhang On Wed, May 30, 2018 at 3:27 AM, Michael Becker <michael.bec...@physik.uni-giessen.de> wrote: Barr

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-05-30 Thread Michael Becker
ou distributed the 125 MPI ranks evenly. --Junchao Zhang On Tue, May 29, 2018 at 6:18 AM, Michael Becker <michael.bec...@physik.uni-giessen.de> wrote: Hello again, here are the updated log_view files for 125 and 1000 processors. I ran both problems twice,

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-05-29 Thread Michael Becker
Hello again, here are the updated log_view files for 125 and 1000 processors. I ran both problems twice, the first time with all processors per node allocated ("-1.txt"), the second with only half on twice the number of nodes ("-2.txt"). On May 24, 2018, at 12:2

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-05-25 Thread Michael Becker
a bar chart of each event for the two cases to see which ones are taking more time and which are taking less (we cannot tell with the two logs you sent us since they are for different solvers.) On May 24, 2018, at 12:24 AM, Michael Becker <michael.bec...@physik.uni-giessen.de> wro

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-05-24 Thread Michael Becker
k On Thu, May 24, 2018 at 5:10 AM, Michael Becker <michael.bec...@physik.uni-giessen.de> wrote: CG/GCR: I accidentally kept gcr in the batch file. That's still from when I was experimenting with the different methods. The

Re: [petsc-users] Poor weak scaling when solving successive linear systems

2018-05-24 Thread Michael Becker
);CHKERRQ(ierr); For the 125 case the arrays l_Nx, l_Ny, l_Nz have dimension 5 and every element has value 30. VecGetLocalSize() returns 27000 for every rank. Is there something I didn't consider? Michael Am 24.05.2018 um 09:39 schrieb Lawrence Mitchell: On 24 May 2018, at 06:24, Michael Becker

[petsc-users] Poor weak scaling when solving successive linear systems

2018-05-23 Thread Michael Becker
Hello, I added a PETSc solver class to our particle-in-cell simulation code and all calculations seem to be correct. However, some weak scaling tests I did are rather disappointing because the solver's runtime keeps increasing with system size although the number of cores is scaled up

Re: [petsc-users] Tuning performance for simple solver

2016-06-03 Thread Michael Becker
Michael Am 03.06.2016 um 14:32 schrieb Matthew Knepley: On Fri, Jun 3, 2016 at 5:56 AM, Dave May <dave.mayhe...@gmail.com> wrote: On 3 June 2016 at 11:37, Michael Becker <michael.bec...

[petsc-users] Tuning performance for simple solver

2016-06-03 Thread Michael Becker
Dear all, I have a few questions regarding possible performance enhancements for the PETSc solver I included in my project. It's a particle-in-cell plasma simulation written in C++, where Poisson's equation needs to be solved repeatedly on every timestep. The simulation domain is discretized
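Since Poisson's equation is solved repeatedly on every timestep with the same matrix, one standard pattern is to create the KSP once and reuse it, so the preconditioner setup is amortized across timesteps. A hedged sketch of that pattern (this is not the poster's actual solver class; function name, arguments, and the assembly step are illustrative, but KSPCreate, KSPSetOperators, KSPSetFromOptions, KSPSolve, and KSPDestroy are the real PETSc calls):

```c
#include <petscksp.h>

/* Illustrative only: reuse one KSP for a fixed operator A across timesteps. */
PetscErrorCode solve_each_step(Mat A, Vec b, Vec x, PetscInt nsteps)
{
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr); /* once: A does not change */
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);     /* e.g. -ksp_type cg -pc_type gamg */

  for (PetscInt step = 0; step < nsteps; ++step) {
    /* ... assemble the new right-hand side b from the particle
     *     charge density for this timestep ... */
    ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);      /* PC setup is reused */
  }

  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  return 0;
}
```

With this structure the expensive preconditioner setup happens only on the first KSPSolve, which is exactly the kind of per-timestep cost the performance-tuning questions in this thread are about.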