Re: [petsc-users] Performance of the Telescope Multigrid Preconditioner

2016-10-07 Thread Dave May
On Friday, 7 October 2016, frank wrote: > Dear all, > > Thank you so much for the advice. > > All setup is done in the first solve. > > >> ** The time for 1st solve does not scale. >> In practice, I am solving a variable coefficient Poisson equation. I >> need to build the
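For readers skimming this thread from the archive: PCTELESCOPE repartitions a problem onto a smaller communicator, which is why its setup cost lands in the first solve. A minimal sketch in C of turning it on (error checking omitted; the trivial diagonal operator and the reduction factor are illustrative only, not the configuration discussed in the thread, where telescope sits on the coarse level of multigrid):

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, b;
  KSP         ksp;
  PC          pc;
  PetscInt    i, Istart, Iend, n = 1000;
  PetscMPIInt size;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* Stand-in operator: a diagonal matrix */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCTELESCOPE);
  /* Gather the problem onto 1 rank; in the thread telescope is instead applied to
     the multigrid coarse level via options such as -mg_coarse_pc_type telescope */
  PCTelescopeSetReductionFactor(pc, size);
  KSPSetFromOptions(ksp);

  KSPSolve(ksp, b, x);  /* first solve: pays for the repartitioning/setup */
  KSPSolve(ksp, b, x);  /* second solve: reuses the setup; this is the one to time */

  KSPDestroy(&ksp); MatDestroy(&A); VecDestroy(&x); VecDestroy(&b);
  PetscFinalize();
  return 0;
}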

Re: [petsc-users] How to Get the last absolute residual that has been computed

2016-10-07 Thread Jed Brown
丁老师 writes: > Dear professor: > How to get the last absolute residual that has been computed? SNESGetFunctionNorm?

Re: [petsc-users] Time cost by Vec Assembly

2016-10-07 Thread Jed Brown
Barry Smith writes: > There is still something wonky here, whether it is the MPI implementation > or how PETSc handles the assembly. Without any values that need to be > communicated it is unacceptable that these calls take so long. If we > understood __exactly__ why

Re: [petsc-users] How to Get the last absolute residual that has been computed

2016-10-07 Thread Barry Smith
KSPGetResidualNorm(); if you wish the true (and not preconditioned) residual you must call KSPSetNormType() before the KSPSolve. SNESGetFunctionNorm() > On Oct 7, 2016, at 10:41 PM, 丁老师 wrote: > > Dear professor: > How to get the last absolute residual that has been
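Barry's suggestion is straightforward to wire up. A minimal sketch in C (error checking omitted; the 1D Laplacian test problem is only illustrative, not the poster's actual system) showing KSPSetNormType() before the solve and KSPGetResidualNorm() after it:

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat       A;
  Vec       x, b;
  KSP       ksp;
  PetscInt  i, Istart, Iend, n = 100;
  PetscReal rnorm;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Assemble a 1D Laplacian as a stand-in operator */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
    if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  /* Ask for the true (unpreconditioned) residual norm; must be set before KSPSolve */
  KSPSetNormType(ksp, KSP_NORM_UNPRECONDITIONED);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);

  /* Last residual norm computed during the solve */
  KSPGetResidualNorm(ksp, &rnorm);
  PetscPrintf(PETSC_COMM_WORLD, "last residual norm = %g\n", (double)rnorm);

  KSPDestroy(&ksp); MatDestroy(&A); VecDestroy(&x); VecDestroy(&b);
  PetscFinalize();
  return 0;
}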

Re: [petsc-users] Time cost by Vec Assembly

2016-10-07 Thread Barry Smith
> On Oct 7, 2016, at 10:44 PM, Jed Brown wrote: > > Barry Smith writes: >>VecAssemblyBegin/End() does a couple of all reduces and then message >> passing (if values need to be moved) to get the values onto the correct >> processes. So these calls
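The point Barry is making is that VecAssemblyBegin/End() are collective even when no values move between processes. A minimal sketch of the pattern whose cost is being discussed (illustrative only; it is not the poster's code):

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec      v;
  PetscInt i, lo, hi, n = 1000000;

  PetscInitialize(&argc, &argv, NULL, NULL);

  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, n, &v);
  VecGetOwnershipRange(v, &lo, &hi);

  /* Only locally owned entries are set, so no values need to be communicated ... */
  for (i = lo; i < hi; i++) VecSetValue(v, i, (PetscScalar)i, INSERT_VALUES);

  /* ... yet the Begin/End pair still performs the collective reductions that
     determine whether any process has stashed off-process values, which is
     the synchronization whose cost is being timed in this thread */
  VecAssemblyBegin(v);
  VecAssemblyEnd(v);

  VecDestroy(&v);
  PetscFinalize();
  return 0;
}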

[petsc-users] How to Get the last absolute residual that has been computed

2016-10-07 Thread 丁老师
Dear professor: How to Get the last absolute residual that has been computed

Re: [petsc-users] Time cost by Vec Assembly

2016-10-07 Thread Barry Smith
> On Oct 7, 2016, at 6:41 PM, frank wrote: > > Hello, > >>> Another thing, the vector assembly and scatter take more time as I >>> increased the number of cores: >>> >>> cores# 4096 8192 >>> 16384 32768

[petsc-users] Time cost by Vec Assembly

2016-10-07 Thread frank
Hello, Another thing, the vector assembly and scatter take more time as I increase the number of cores: cores# 4096 8192 16384 32768 65536 VecAssemblyBegin 298 2.91E+00 2.87E+00 8.59E+00 2.75E+01

Re: [petsc-users] Performance of the Telescope Multigrid Preconditioner

2016-10-07 Thread Barry Smith
> On Oct 7, 2016, at 4:49 PM, frank wrote: > > Dear all, > > Thank you so much for the advice. >> All setup is done in the first solve. >> >> ** The time for 1st solve does not scale. >> In practice, I am solving a variable coefficient Poisson equation. I >> need to

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-07 Thread Barry Smith
Fande, If you can reproduce the problem with PETSc 3.7.4 please send us sample code that produces it so we can work with Sherry to get it fixed ASAP. Barry > On Oct 7, 2016, at 10:23 AM, Satish Balay wrote: > > On Fri, 7 Oct 2016, Kong, Fande wrote: > >> On

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-07 Thread Satish Balay
On Fri, 7 Oct 2016, Kong, Fande wrote: > On Fri, Oct 7, 2016 at 9:04 AM, Satish Balay wrote: > > > On Fri, 7 Oct 2016, Anton Popov wrote: > > > > > Hi guys, > > > > > > are there any news about fixing buggy behavior of SuperLU_DIST, exactly > > what > > > is described here: >

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-07 Thread Matthew Knepley
On Fri, Oct 7, 2016 at 10:16 AM, Kong, Fande wrote: > On Fri, Oct 7, 2016 at 9:04 AM, Satish Balay wrote: > >> On Fri, 7 Oct 2016, Anton Popov wrote: >> >> > Hi guys, >> > >> > are there any news about fixing buggy behavior of SuperLU_DIST, exactly >> what

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-07 Thread Kong, Fande
On Fri, Oct 7, 2016 at 9:04 AM, Satish Balay wrote: > On Fri, 7 Oct 2016, Anton Popov wrote: > > > Hi guys, > > > > are there any news about fixing buggy behavior of SuperLU_DIST, exactly > what > > is described here: > > > >

Re: [petsc-users] SuperLU_dist issue in 3.7.4

2016-10-07 Thread Satish Balay
On Fri, 7 Oct 2016, Anton Popov wrote: > Hi guys, > > are there any news about fixing buggy behavior of SuperLU_DIST, exactly what > is described here: > > http://lists.mcs.anl.gov/pipermail/petsc-users/2015-August/026802.html ? > > I'm using 3.7.4 and still get SEGV in pdgssvx routine.

[petsc-users] SuperLU_dist issue in 3.7.4

2016-10-07 Thread Anton Popov
Hi guys, is there any news about fixing the buggy behavior of SuperLU_DIST, exactly what is described here: http://lists.mcs.anl.gov/pipermail/petsc-users/2015-August/026802.html ? I'm using 3.7.4 and still get a SEGV in the pdgssvx routine. Everything works fine with 3.5.4. Do I still have to
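For anyone trying to reproduce this, the SuperLU_DIST factorization is selected through PETSc's LU preconditioner. A minimal sketch using the 3.7-era API names (error checking omitted; the trivial tridiagonal matrix is only a placeholder, not the system that triggers the SEGV):

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, Istart, Iend, n = 50;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Placeholder tridiagonal matrix */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
    if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetType(ksp, KSPPREONLY);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCLU);
  /* In PETSc 3.7 the solver package is chosen with PCFactorSetMatSolverPackage(),
     or at run time with -pc_factor_mat_solver_package superlu_dist */
  PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);   /* the factorization (pdgssvx) runs inside the first solve */

  KSPDestroy(&ksp); MatDestroy(&A); VecDestroy(&x); VecDestroy(&b);
  PetscFinalize();
  return 0;
}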

Re: [petsc-users] Performance of the Telescope Multigrid Preconditioner

2016-10-07 Thread Dave May
On 7 October 2016 at 02:05, Matthew Knepley wrote: > On Thu, Oct 6, 2016 at 7:33 PM, frank wrote: > >> Dear Dave, >> Following your advice, I solve the identical equation twice and time the two >> steps separately. The result is below: >> >> Test: 1024^3 grid
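The "solve the identical equation twice and time the two steps separately" experiment is easy to express with log stages. A minimal sketch (the 1D Laplacian is only a stand-in, not frank's 1024^3 Poisson problem; run with -log_view, or -log_summary in 3.7, to see per-stage timings):

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat           A;
  Vec           x, b;
  KSP           ksp;
  PetscInt      i, Istart, Iend, n = 1000;
  PetscLogStage stage1, stage2;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Stand-in operator: 1D Laplacian */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
    if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);

  PetscLogStageRegister("Solve 1 (setup)", &stage1);
  PetscLogStageRegister("Solve 2 (no setup)", &stage2);

  PetscLogStagePush(stage1);
  KSPSolve(ksp, b, x);   /* includes PCSetUp and any repartitioning */
  PetscLogStagePop();

  PetscLogStagePush(stage2);
  KSPSolve(ksp, b, x);   /* setup is reused; this is the solve time whose scaling is discussed */
  PetscLogStagePop();

  KSPDestroy(&ksp); MatDestroy(&A); VecDestroy(&x); VecDestroy(&b);
  PetscFinalize();
  return 0;
}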