Re: [petsc-users] malconfigured gamg

2017-01-11 Thread Jed Brown
Barry Smith writes:
> Ok, how about just checking there is at least one smoothing on the finest level? This would catch most simple user errors, or do you know about oddball cases with no smoothing on the finest level?

No idea; what if it's inside a PCComposite

Re: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur

2017-01-11 Thread Barry Smith
> On Jan 11, 2017, at 9:21 PM, Matthew Knepley wrote:
>
> On Wed, Jan 11, 2017 at 8:31 PM, Barry Smith wrote:
>
> > Thanks, this is very useful information. It means that
> > 1) the approximate Sp is actually a very good approximation to the true Schur

Re: [petsc-users] malconfigured gamg

2017-01-11 Thread Barry Smith
Ok, how about just checking there is at least one smoothing on the finest level? This would catch most simple user errors, or do you know about oddball cases with no smoothing on the finest level?

> On Jan 11, 2017, at 9:31 PM, Jed Brown wrote:
>
> Barry Smith

Re: [petsc-users] malconfigured gamg

2017-01-11 Thread Jed Brown
Barry Smith writes:
>> On Jan 11, 2017, at 3:51 PM, Jed Brown wrote:
>>
>> Arne Morten Kvarving writes:
>>
>>> hi,
>>>
>>> first, this was a user error and i totally acknowledge this, but i wonder if this might be

Re: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur

2017-01-11 Thread Barry Smith
That is disappointing. Please try using -pc_fieldsplit_schur_precondition full with the two cases of -fieldsplit_FE_split_pc_type gamg and -fieldsplit_FE_split_pc_type cholesky.

Barry

> On Jan 11, 2017, at 8:49 PM, David Knezevic wrote:
>
> OK,
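For readers skimming the thread, the two runs Barry is asking for differ only in these options (any other solver options already in use are omitted here and would stay the same):

  -pc_fieldsplit_schur_precondition full -fieldsplit_FE_split_pc_type gamg
  -pc_fieldsplit_schur_precondition full -fieldsplit_FE_split_pc_type cholesky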

Re: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur

2017-01-11 Thread Barry Smith
Thanks, this is very useful information. It means that

1) the approximate Sp is actually a very good approximation to the true Schur complement S, since using Sp^-1 to precondition S gives iteration counts from 8 to 13.

2) using ilu(0) as a preconditioner for Sp is not good, since
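For reference (a standard definition, not spelled out in the preview above): with the block system [A00 A01; A10 A11] and PCFIELDSPLIT eliminating the first field, the true Schur complement is S = A11 - A10 A00^{-1} A01, while Sp denotes the (approximate) matrix used to build the preconditioner for solves with S.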

Re: [petsc-users] malconfigured gamg

2017-01-11 Thread Barry Smith
> On Jan 11, 2017, at 3:51 PM, Jed Brown wrote:
>
> Arne Morten Kvarving writes:
>
>> hi,
>>
>> first, this was a user error and i totally acknowledge this, but i wonder if this might be an oversight in your error checking: if you

Re: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur

2017-01-11 Thread Barry Smith
Can you please run with all the monitoring on, so we can see the convergence of all the inner solvers?

  -fieldsplit_FE_split_ksp_monitor

Then run again with

  -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky

and send both sets of results.

Barry

> On Jan 11,
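As an illustration only (the executable name ./my_solver and the <existing options> placeholder are hypothetical, not from the thread), the two requested runs would look something like:

  ./my_solver <existing options> -fieldsplit_FE_split_ksp_monitor
  ./my_solver <existing options> -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky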

Re: [petsc-users] malconfigured gamg

2017-01-11 Thread Jed Brown
Arne Morten Kvarving writes:
> hi,
>
> first, this was a user error and i totally acknowledge this, but i wonder if this might be an oversight in your error checking: if you configure gamg with ilu/asm smoothing, and are stupid enough to have set the
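The preview cuts off before the actual misconfiguration, but for context, selecting ILU/ASM smoothers on the GAMG levels is typically done with options along these lines (an illustration of the kind of setup being described, not necessarily the reporter's exact one):

  -pc_type gamg -mg_levels_pc_type asm -mg_levels_sub_pc_type ilu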

Re: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur

2017-01-11 Thread Dave May
It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). This is killing the performance. Are you sure that A11 is a good approximation to S? You might consider trying the selfp option
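For reference, the option Dave is pointing at is

  -pc_fieldsplit_schur_precondition selfp

which builds an explicit approximation to the Schur complement using the inverse of the diagonal of A00, i.e. Sp = A11 - A10 inv(diag(A00)) A01.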

[petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur

2017-01-11 Thread David Knezevic
I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. So I did the
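A hedged sketch of the kind of option set David describes (the prefixes below assume the default unnamed splits; the rest of the thread uses a named split with the fieldsplit_FE_split_ prefix, so the exact names in his setup may differ):

  -pc_type fieldsplit -pc_fieldsplit_type schur
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
  -fieldsplit_0_pc_factor_mat_solver_package mumps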

Re: [petsc-users] Call multiple error handler

2017-01-11 Thread Barry Smith
> On Jan 11, 2017, at 7:00 AM, Florian Lindner wrote:
>
> Hey,
>
> On 10.01.2017 at 14:20, Barry Smith wrote:
>>
>> I do not understand what you mean. I hope this is C.
>
> Yes, I'm talking about C(++).
>
> I'm using PETSc functions like that:
>
> ierr = ISDestroy();

[petsc-users] Visualization of uninterpolated DMPlex with hdf5 is broken

2017-01-11 Thread Sander Arens
Visualization of uninterpolated DMPlex with hdf5 currently does not work. I think the culprit is this line. Is this to avoid duplicating output
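For context, a minimal sketch of the reported scenario (writing an uninterpolated DMPlex through an HDF5 viewer); the mesh and output file names are placeholders, and this assumes a PETSc build configured with HDF5:

#include <petscdmplex.h>
#include <petscviewerhdf5.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* interpolate = PETSC_FALSE: the plex has only cells and vertices, no faces/edges */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.exo", PETSC_FALSE, &dm);CHKERRQ(ierr);
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = DMView(dm, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}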

Re: [petsc-users] Call multiple error handler

2017-01-11 Thread Matthew Knepley
On Wed, Jan 11, 2017 at 7:00 AM, Florian Lindner wrote:
> Hey,
>
> On 10.01.2017 at 14:20, Barry Smith wrote:
> >
> > I do not understand what you mean. I hope this is C.
>
> Yes, I'm talking about C(++).
>
> I'm using PETSc functions like that:
>
> ierr = ISDestroy();

Re: [petsc-users] Call multiple error handler

2017-01-11 Thread Florian Lindner
Hey,

On 10.01.2017 at 14:20, Barry Smith wrote:
>
> I do not understand what you mean. I hope this is C.

Yes, I'm talking about C(++).

I'm using PETSc functions like that:

ierr = ISDestroy(); CHKERRV(ierr);

Unfortunately that is not always possible, e.g. in this function:
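Since the subject of this thread is calling more than one error handler, here is a minimal hedged sketch (MyErrorHandler and the decision to chain to the built-in traceback handler are my own illustration, not code from the thread) of pushing a custom handler with PetscPushErrorHandler that does its own reporting and then forwards to PetscTraceBackErrorHandler:

#include <petscsys.h>

/* Custom handler: report the error ourselves, then forward to PETSc's
   default traceback handler so both effectively get called. */
static PetscErrorCode MyErrorHandler(MPI_Comm comm, int line, const char *func,
                                     const char *file, PetscErrorCode n,
                                     PetscErrorType p, const char *mess, void *ctx)
{
  fprintf(stderr, "MyErrorHandler caught error %d: %s\n", (int)n, mess ? mess : "(no message)");
  return PetscTraceBackErrorHandler(comm, line, func, file, n, p, mess, ctx);
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = PetscPushErrorHandler(MyErrorHandler, NULL);CHKERRQ(ierr);
  /* ... PETSc calls; any error now goes through MyErrorHandler first ... */
  ierr = PetscPopErrorHandler();CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}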