Re: [petsc-users] (no subject)

2018-10-31 Thread Smith, Barry F. via petsc-users
https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > On Oct 31, 2018, at 9:18 PM, Wenjin Xing wrote: > > Hi Barry > > As you said, I have set the matrix type to AIJ. (MATAIJ = "aij" - A matrix > type to be used for sparse matrices. This matrix type is identical to > MATSEQAIJ
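
A minimal sketch, not from the thread, of forcing the AIJ type from code; the matrix name A and its sizes are illustrative:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat            A;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
      ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100); CHKERRQ(ierr);
      /* MATAIJ resolves to MATSEQAIJ on one process, MATMPIAIJ on several */
      ierr = MatSetType(A, MATAIJ); CHKERRQ(ierr);
      ierr = MatSetUp(A); CHKERRQ(ierr);
      /* ... fill with MatSetValues(), then MatAssemblyBegin/End ... */
      ierr = MatDestroy(&A); CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }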

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Smith, Barry F. via petsc-users
> On Oct 31, 2018, at 5:39 PM, Appel, Thibaut via petsc-users > wrote: > > Well yes naturally for the residual but adding -ksp_true_residual just gives > > 0 KSP unpreconditioned resid norm 3.583290589961e+00 true resid norm > 3.583290589961e+00 ||r(i)||/||b|| 1.e+00 > 1 KSP
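
For reference, the monitor that prints both norms is -ksp_monitor_true_residual (the output shown matches that monitor); a sketch of enabling it from code instead of the command line, assuming a KSP named ksp already exists:

    /* equivalent to passing -ksp_monitor_true_residual on the command line */
    PetscErrorCode ierr;

    ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL); CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);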

Re: [petsc-users] (no subject)

2018-10-31 Thread Smith, Barry F. via petsc-users
This option only works with AIJ matrices; are you perhaps using BAIJ or SBAIJ matrices (or a shell matrix)? Barry > On Oct 31, 2018, at 5:45 AM, Wenjin Xing via petsc-users > wrote: > > My issue is summarized in the picture and posted in the link >
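
A small sketch of verifying the runtime type and converting to AIJ in place, assuming a Mat A assembled elsewhere:

    MatType        type;
    PetscErrorCode ierr;

    ierr = MatGetType(A, &type); CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD, "matrix type: %s\n", type); CHKERRQ(ierr);
    /* make AIJ-only options applicable to a BAIJ/SBAIJ matrix */
    ierr = MatConvert(A, MATAIJ, MAT_INPLACE_MATRIX, &A); CHKERRQ(ierr);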

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Appel, Thibaut via petsc-users
Well yes naturally for the residual but adding -ksp_true_residual just gives 0 KSP unpreconditioned resid norm 3.583290589961e+00 true resid norm 3.583290589961e+00 ||r(i)||/||b|| 1.e+00 1 KSP unpreconditioned resid norm 0.e+00 true resid norm 3.583290589961e+00
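
A preconditioned residual that collapses to zero while the true residual stays at its initial value usually means the preconditioner has broken down; a sketch of querying the outcome after the solve, assuming a KSP named ksp:

    /* after KSPSolve(ksp, b, x) */
    KSPConvergedReason reason;
    PetscErrorCode     ierr;

    ierr = KSPGetConvergedReason(ksp, &reason); CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD, "KSP %s (%s)\n",
                       reason < 0 ? "diverged" : "converged",
                       KSPConvergedReasons[reason]); CHKERRQ(ierr);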

Re: [petsc-users] Convergence of AMG

2018-10-31 Thread Mark Adams via petsc-users
On Wed, Oct 31, 2018 at 3:43 PM Manav Bhatia wrote: > Here are the updates. I did not find the options to make much difference > in the results. > > I noticed this message in the GAMG output for cases 2, 3: HARD stop of > coarsening on level 3. Grid too small: 1 block nodes > Yea, this is
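
The point at which GAMG stops coarsening can be steered from the options database; a sketch, where the limit of 100 equations is an arbitrary example value:

    /* command-line equivalent: -pc_gamg_coarse_eq_limit 100 */
    PetscErrorCode ierr;

    ierr = PetscOptionsSetValue(NULL, "-pc_gamg_coarse_eq_limit", "100"); CHKERRQ(ierr);
    /* inspect the resulting level hierarchy with -ksp_view */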

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Mark Adams via petsc-users
These are indefinite (bad) Helmholtz problems. Right? On Wed, Oct 31, 2018 at 2:38 PM Matthew Knepley wrote: > On Wed, Oct 31, 2018 at 2:13 PM Thibaut Appel > wrote: > >> Hi Mark, Matthew, >> >> Thanks for taking the time. >> >> 1) You're not suggesting having -fieldsplit_X_ksp_type *f*gmres
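
FGMRES is flexible: it tolerates a preconditioner that changes from iteration to iteration, such as an inner iterative solve, which is why it comes up for the outer Krylov method rather than for each split; a sketch, where ksp is the outer solver created elsewhere:

    PetscErrorCode ierr;

    ierr = KSPSetType(ksp, KSPFGMRES); CHKERRQ(ierr); /* flexible GMRES outer solver */
    /* per-split inner solvers may then themselves be iterative,
       e.g. -fieldsplit_<name>_ksp_type gmres on the command line */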

Re: [petsc-users] Two applications with PETSc

2018-10-31 Thread Guido Giuntoli via petsc-users
This is what I need! Thank you Matt! On Wed, Oct 31, 2018 at 19:53, Matthew Knepley wrote: > On Wed, Oct 31, 2018 at 1:34 PM Guido Giuntoli via petsc-users < > petsc-users@mcs.anl.gov> wrote: > >> Hi, I have two codes that use PETSc. The first one is parallel and uses >> MPI and the
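
The thread is truncated here, but one documented pattern when one of the two codes should run PETSc serially is to assign PETSC_COMM_WORLD before PetscInitialize(); a hedged sketch, not necessarily the fix discussed in the thread:

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;

      MPI_Init(&argc, &argv);
      /* must happen after MPI_Init() and before PetscInitialize() */
      PETSC_COMM_WORLD = PETSC_COMM_SELF;
      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      /* ... each rank now runs its own single-process PETSc ... */
      ierr = PetscFinalize();
      MPI_Finalize();
      return 0;
    }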

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Thibaut Appel via petsc-users
Hi Mark, Matthew, Thanks for taking the time. 1) You're not suggesting having -fieldsplit_X_ksp_type *f*gmres for each field, are you? 2) No, the matrix *has* pressure in one of the fields. Here it's a 2D problem (but we're also doing 3D), the unknowns are (p,u,v) and those are my 3
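
With unknowns interlaced as (p,u,v) at every node, splits can be declared by component index; a minimal sketch assuming block size 3, a pc obtained from KSPGetPC(), and illustrative split names (grouping u,v is one choice; three separate splits work the same way):

    const PetscInt pfield[] = {0}, ufield[] = {1, 2};
    PetscErrorCode ierr;

    ierr = PCSetType(pc, PCFIELDSPLIT); CHKERRQ(ierr);
    ierr = PCFieldSplitSetBlockSize(pc, 3); CHKERRQ(ierr);
    ierr = PCFieldSplitSetFields(pc, "p",  1, pfield, pfield); CHKERRQ(ierr);
    ierr = PCFieldSplitSetFields(pc, "uv", 2, ufield, ufield); CHKERRQ(ierr);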

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Mark Adams via petsc-users
Again, you probably want to avoid Chebyshev; use '-mg_levels_ksp_type richardson -mg_levels_pc_type sor' with the proper prefix. I'm not sure about "-fieldsplit_pc_type gamg": GAMG should work on one block, and hence be a sub-PC, but I'm not up on the fieldsplit syntax. On Wed, Oct 31, 2018 at 9:22 AM
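
Spelled out with an illustrative prefix (the actual split name depends on how the fields were registered), the suggestion reads roughly as follows:

    /* command-line equivalent, for an assumed split named "uv":
         -fieldsplit_uv_pc_type gamg
         -fieldsplit_uv_mg_levels_ksp_type richardson
         -fieldsplit_uv_mg_levels_pc_type sor                    */
    PetscErrorCode ierr;

    ierr = PetscOptionsSetValue(NULL, "-fieldsplit_uv_pc_type", "gamg"); CHKERRQ(ierr);
    ierr = PetscOptionsSetValue(NULL, "-fieldsplit_uv_mg_levels_ksp_type", "richardson"); CHKERRQ(ierr);
    ierr = PetscOptionsSetValue(NULL, "-fieldsplit_uv_mg_levels_pc_type", "sor"); CHKERRQ(ierr);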

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Mark Adams via petsc-users
On Tue, Oct 30, 2018 at 5:23 PM Appel, Thibaut via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear users, > > Following a suggestion from Matthew Knepley I’ve been trying to apply > fieldsplit/gamg for my set of PDEs but I’m still encountering issues > despite various tests. pc_gamg simply
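
A bare-bones solve skeleton that defers every solver choice to the options database, so fieldsplit/gamg variants can be compared without recompiling; A, b, x are assumed assembled elsewhere:

    /* run with e.g. -pc_type fieldsplit -ksp_monitor_true_residual */
    KSP            ksp;
    PetscErrorCode ierr;

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); /* picks up all -ksp_/-pc_ options */
    ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);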

Re: [petsc-users] DIVERGED_NANORINF with PC GAMG

2018-10-31 Thread Thibaut Appel via petsc-users
Hi Matthew, Which database option are you referring to? I tried to add -fieldsplit_mg_levels_ksp_type gmres (and -fieldsplit_mg_levels_ksp_max_it 4 for another run) to my options (cf. below) which starts the iterations but it takes 1 hour for PETSc to do 13 of them so it must be wrong.
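
When iterations are unexpectedly slow, PETSc's built-in profiling shows where the time goes; a sketch (the command-line form is simply -log_view):

    /* equivalent to passing -log_view on the command line */
    PetscErrorCode ierr;

    ierr = PetscOptionsSetValue(NULL, "-log_view", NULL); CHKERRQ(ierr);
    /* a per-event timing summary (KSPSolve, PCApply, MatMult, ...)
       is printed at PetscFinalize() */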

[petsc-users] (no subject)

2018-10-31 Thread Wenjin Xing via petsc-users
My issue is summarized in the picture and the question posted at https://scicomp.stackexchange.com/questions/30458/what-does-the-error-this-matrix-type-does-not-have-a-find-zero-diagonals-define?noredirect=1#comment56074_30458 Kind regards Wenjin
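
The error in the linked question reports that the matrix type has no "find zero diagonals" implementation; a hedged sketch of converting to AIJ, which provides one, and listing the zero diagonal entries (Mat A from the user's code):

    IS             zerodiag;
    PetscErrorCode ierr;

    ierr = MatConvert(A, MATAIJ, MAT_INPLACE_MATRIX, &A); CHKERRQ(ierr);
    ierr = MatFindZeroDiagonals(A, &zerodiag); CHKERRQ(ierr);
    ierr = ISView(zerodiag, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);
    ierr = ISDestroy(&zerodiag); CHKERRQ(ierr);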

Re: [petsc-users] Segmentation violation

2018-10-31 Thread Santiago Andres Triana via petsc-users
Hi Hong, You can find the matrices here: https://www.dropbox.com/s/ejpa9owkv8tjnwi/A.petsc?dl=0 https://www.dropbox.com/s/urjtxaezl0cv3om/B.petsc?dl=0 Changing the target value leads to the same error. What is strange is that this works without a problem on two other machines. But in my main
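
For anyone reproducing this, a sketch of loading one of the posted binary matrices; the file name comes from the links above, and the eigensolver setup itself is omitted:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat            A;
      PetscViewer    viewer;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A.petsc", FILE_MODE_READ, &viewer); CHKERRQ(ierr);
      ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
      ierr = MatLoad(A, viewer); CHKERRQ(ierr);
      ierr = PetscViewerDestroy(&viewer); CHKERRQ(ierr);
      /* ... same for B.petsc, then hand A and B to the eigensolver ... */
      ierr = MatDestroy(&A); CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }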