Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-17 Thread Alexander Lindsay
Good to know. I may take a shot at it depending on need and time! I've opened
https://gitlab.com/petsc/petsc/-/issues/1362 to track it.

Alex

On Sun, Apr 16, 2023 at 9:27 PM Pierre Jolivet 
wrote:

>
> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay 
> wrote:
>
> Are there any plans to get the missing hook into PETSc for AIR? Just
> curious if there’s an issue I can subscribe to or anything.
>
>
> Not that I know of, but it would make for a nice contribution if you feel
> like creating a PR.
>
> Thanks,
> Pierre
>
> (Independently I’m excited to test HPDDM out tomorrow)
>
> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet 
> wrote:
>
> 
>
> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay 
> wrote:
>
> Pierre,
>
> This is very helpful information. Thank you. Yes I would appreciate those
> command line options if you’re willing to share!
>
>
> No problem, I’ll get in touch with you in private first, because it may
> require some extra work (need a couple of extra options in PETSc
> ./configure), and this is not very related to the problem at hand, so best
> not to spam the mailing list.
>
> Thanks,
> Pierre
>
> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet 
> wrote:
>
> 
>
> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay 
> wrote:
>
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
> numbers. My options table
>
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
>
> works wonderfully for a low Reynolds number of 2.2. The solver performance
> crushes LU as I scale up the problem. However, not surprisingly this
> options table struggles when I bump the Reynolds number to 220. I've read
> that use of AIR (approximate ideal restriction) can improve performance for
> advection dominated problems. I've tried
> setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion
> problem and the option works fine. However, when applying it to my
> field-split preconditioned Navier-Stokes system, I get immediate
> non-convergence:
>
>  0 Nonlinear |R| = 1.033077e+03
>   0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>
> Does anyone have an idea as to why this might be happening?
>
>
> Do not use this option, even when not part of PCFIELDSPLIT.
> There is some missing plumbing in PETSc which makes it unusable, see Ben’s
> comment here
> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
> In fact, it’s quite easy to make HYPRE generate NaN with a very simple
> stabilized convection-diffusion problem near the pure convection limit
> (something that ℓAIR is supposed to handle).
> Even worse, you can make HYPRE fill your terminal with printf-style
> debugging messages
> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>  with
> this option turned on.
> As a result, I have been unable to reproduce any of the ℓAIR results.
> This also explains why I have been using plain BoomerAMG instead of ℓAIR
> for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if
> you would like to try the PC we are using, I could send you the command
> line options).
>
> Thanks,
> Pierre
>
> If not, I'd take a suggestion on where to set a breakpoint to start my own
> investigation. Alternatively, I welcome other preconditioning suggestions
> for an advection dominated problem.
>
> Alex
>
>
>
>
>


Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-16 Thread Pierre Jolivet

> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay  
> wrote:
> 
> Are there any plans to get the missing hook into PETSc for AIR? Just curious 
> if there’s an issue I can subscribe to or anything.

Not that I know of, but it would make for a nice contribution if you feel like 
creating a PR.

Thanks,
Pierre 

> (Independently I’m excited to test HPDDM out tomorrow)
> 
>> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet  wrote:
>> 
>> 
>>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay  
>>> wrote:
>>> 
>>> Pierre,
>>> 
>>> This is very helpful information. Thank you. Yes I would appreciate those 
>>> command line options if you’re willing to share!
>> 
>> No problem, I’ll get in touch with you in private first, because it may 
>> require some extra work (need a couple of extra options in PETSc 
>> ./configure), and this is not very related to the problem at hand, so best 
>> not to spam the mailing list.
>> 
>> Thanks,
>> Pierre
>> 
 On Apr 13, 2023, at 9:54 PM, Pierre Jolivet  wrote:
 
 
 
> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
> wrote:
> 
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds 
> numbers. My options table
> 
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
> 
> works wonderfully for a low Reynolds number of 2.2. The solver 
> performance crushes LU as I scale up the problem. However, not 
> surprisingly this options table struggles when I bump the Reynolds number 
> to 220. I've read that use of AIR (approximate ideal restriction) can 
> improve performance for advection dominated problems. I've tried setting 
> -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and 
> the option works fine. However, when applying it to my field-split 
> preconditioned Navier-Stokes system, I get immediate non-convergence:
> 
>  0 Nonlinear |R| = 1.033077e+03
>   0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
> 
> Does anyone have an idea as to why this might be happening?
 
 Do not use this option, even when not part of PCFIELDSPLIT.
 There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
 comment here 
 https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
 In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
 stabilized convection-diffusion problem near the pure convection limit 
 (something that ℓAIR is supposed to handle).
 Even worse, you can make HYPRE fill your terminal with printf-style 
 debugging messages 
 https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
  with this option turned on.
 As a result, I have been unable to reproduce any of the ℓAIR results.
 This also explains why I have been using plain BoomerAMG instead of ℓAIR 
 for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if 
 you would like to try the PC we are using, I could send you the command 
 line options).
 
 Thanks,
 Pierre
 
> If not, I'd take a suggestion on where to set a breakpoint to start my 
> own investigation. Alternatively, I welcome other preconditioning 
> suggestions for an advection dominated problem.
> 
> Alex
 
>> 



Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-16 Thread Alexander Lindsay
Are there any plans to get the missing hook into PETSc for AIR? Just curious if 
there’s an issue I can subscribe to or anything.

(Independently I’m excited to test HPDDM out tomorrow)

> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet  wrote:
> 
> 
>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay  
>> wrote:
>> 
>> Pierre,
>> 
>> This is very helpful information. Thank you. Yes I would appreciate those 
>> command line options if you’re willing to share!
> 
> No problem, I’ll get in touch with you in private first, because it may 
> require some extra work (need a couple of extra options in PETSc 
> ./configure), and this is not very related to the problem at hand, so best 
> not to spam the mailing list.
> 
> Thanks,
> Pierre
> 
 On Apr 13, 2023, at 9:54 PM, Pierre Jolivet  wrote:
 
>>> 
>>> 
 On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
 wrote:
 
 Hi, I'm trying to solve steady Navier-Stokes for different Reynolds 
 numbers. My options table
 
 -dm_moose_fieldsplit_names u,p
 -dm_moose_nfieldsplits 2
 -fieldsplit_p_dm_moose_vars pressure
 -fieldsplit_p_ksp_type preonly
 -fieldsplit_p_pc_type jacobi
 -fieldsplit_u_dm_moose_vars vel_x,vel_y
 -fieldsplit_u_ksp_type preonly
 -fieldsplit_u_pc_hypre_type boomeramg
 -fieldsplit_u_pc_type hypre
 -pc_fieldsplit_schur_fact_type full
 -pc_fieldsplit_schur_precondition selfp
 -pc_fieldsplit_type schur
 -pc_type fieldsplit
 
 works wonderfully for a low Reynolds number of 2.2. The solver performance 
 crushes LU as I scale up the problem. However, not surprisingly this 
 options table struggles when I bump the Reynolds number to 220. I've read 
 that use of AIR (approximate ideal restriction) can improve performance 
 for advection dominated problems. I've tried setting 
 -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and 
 the option works fine. However, when applying it to my field-split 
 preconditioned Navier-Stokes system, I get immediate non-convergence:
 
  0 Nonlinear |R| = 1.033077e+03
   0 Linear |R| = 1.033077e+03
   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
 Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
 
 Does anyone have an idea as to why this might be happening?
>>> 
>>> Do not use this option, even when not part of PCFIELDSPLIT.
>>> There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
>>> comment here 
>>> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>>> In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
>>> stabilized convection-diffusion problem near the pure convection limit 
>>> (something that ℓAIR is supposed to handle).
>>> Even worse, you can make HYPRE fill your terminal with printf-style 
>>> debugging messages 
>>> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>>>  with this option turned on.
>>> As a result, I have been unable to reproduce any of the ℓAIR results.
>>> This also explains why I have been using plain BoomerAMG instead of ℓAIR 
>>> for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if 
>>> you would like to try the PC we are using, I could send you the command 
>>> line options).
>>> 
>>> Thanks,
>>> Pierre
>>> 
 If not, I'd take a suggestion on where to set a breakpoint to start my own 
 investigation. Alternatively, I welcome other preconditioning suggestions 
 for an advection dominated problem.
 
 Alex
>>> 
> 


Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Pierre Jolivet

> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay  
> wrote:
> 
> Pierre,
> 
> This is very helpful information. Thank you. Yes I would appreciate those 
> command line options if you’re willing to share!

No problem, I’ll get in touch with you in private first, because it may require 
some extra work (need a couple of extra options in PETSc ./configure), and this 
is not very related to the problem at hand, so best not to spam the mailing 
list.

Thanks,
Pierre

>> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet  wrote:
>> 
>> 
>> 
>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
>>> wrote:
>>> 
>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds 
>>> numbers. My options table
>>> 
>>> -dm_moose_fieldsplit_names u,p
>>> -dm_moose_nfieldsplits 2
>>> -fieldsplit_p_dm_moose_vars pressure
>>> -fieldsplit_p_ksp_type preonly
>>> -fieldsplit_p_pc_type jacobi
>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>>> -fieldsplit_u_ksp_type preonly
>>> -fieldsplit_u_pc_hypre_type boomeramg
>>> -fieldsplit_u_pc_type hypre
>>> -pc_fieldsplit_schur_fact_type full
>>> -pc_fieldsplit_schur_precondition selfp
>>> -pc_fieldsplit_type schur
>>> -pc_type fieldsplit
>>> 
>>> works wonderfully for a low Reynolds number of 2.2. The solver performance 
>>> crushes LU as I scale up the problem. However, not surprisingly this 
>>> options table struggles when I bump the Reynolds number to 220. I've read 
>>> that use of AIR (approximate ideal restriction) can improve performance for 
>>> advection dominated problems. I've tried setting 
>>> -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and 
>>> the option works fine. However, when applying it to my field-split 
>>> preconditioned Navier-Stokes system, I get immediate non-convergence:
>>> 
>>>  0 Nonlinear |R| = 1.033077e+03
>>>   0 Linear |R| = 1.033077e+03
>>>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>>> 
>>> Does anyone have an idea as to why this might be happening?
>> 
>> Do not use this option, even when not part of PCFIELDSPLIT.
>> There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
>> comment here 
>> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>> In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
>> stabilized convection-diffusion problem near the pure convection limit 
>> (something that ℓAIR is supposed to handle).
>> Even worse, you can make HYPRE fill your terminal with printf-style 
>> debugging messages 
>> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>>  with this option turned on.
>> As a result, I have been unable to reproduce any of the ℓAIR results.
>> This also explains why I have been using plain BoomerAMG instead of ℓAIR for 
>> the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you 
>> would like to try the PC we are using, I could send you the command line 
>> options).
>> 
>> Thanks,
>> Pierre
>> 
>>> If not, I'd take a suggestion on where to set a breakpoint to start my own 
>>> investigation. Alternatively, I welcome other preconditioning suggestions 
>>> for an advection dominated problem.
>>> 
>>> Alex
>> 



Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Alexander Lindsay
Pierre,

This is very helpful information. Thank you. Yes I would appreciate those 
command line options if you’re willing to share!

> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet  wrote:
> 
> 
> 
>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
>>> wrote:
>>> 
>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds 
>>> numbers. My options table
>>> 
>>> -dm_moose_fieldsplit_names u,p
>>> -dm_moose_nfieldsplits 2
>>> -fieldsplit_p_dm_moose_vars pressure
>>> -fieldsplit_p_ksp_type preonly
>>> -fieldsplit_p_pc_type jacobi
>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>>> -fieldsplit_u_ksp_type preonly
>>> -fieldsplit_u_pc_hypre_type boomeramg
>>> -fieldsplit_u_pc_type hypre
>>> -pc_fieldsplit_schur_fact_type full
>>> -pc_fieldsplit_schur_precondition selfp
>>> -pc_fieldsplit_type schur
>>> -pc_type fieldsplit
>>> 
>>> works wonderfully for a low Reynolds number of 2.2. The solver performance 
>>> crushes LU as I scale up the problem. However, not surprisingly this 
>>> options table struggles when I bump the Reynolds number to 220. I've read 
>>> that use of AIR (approximate ideal restriction) can improve performance for 
>>> advection dominated problems. I've tried setting 
>>> -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and 
>>> the option works fine. However, when applying it to my field-split 
>>> preconditioned Navier-Stokes system, I get immediate non-convergence:
>>> 
>>>  0 Nonlinear |R| = 1.033077e+03
>>>   0 Linear |R| = 1.033077e+03
>>>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>>> 
>>> Does anyone have an idea as to why this might be happening?
>> 
>> Do not use this option, even when not part of PCFIELDSPLIT.
>> There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
>> comment here 
>> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>> In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
>> stabilized convection-diffusion problem near the pure convection limit 
>> (something that ℓAIR is supposed to handle).
>> Even worse, you can make HYPRE fill your terminal with printf-style 
>> debugging messages 
>> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>>  with this option turned on.
>> As a result, I have been unable to reproduce any of the ℓAIR results.
>> This also explains why I have been using plain BoomerAMG instead of ℓAIR for 
>> the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you 
>> would like to try the PC we are using, I could send you the command line 
>> options).
>> 
>> Thanks,
>> Pierre
>> 
>> If not, I'd take a suggestion on where to set a breakpoint to start my own 
>> investigation. Alternatively, I welcome other preconditioning suggestions 
>> for an advection dominated problem.
>> 
>> Alex
> 


Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Pierre Jolivet


> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
> wrote:
> 
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. 
> My options table
> 
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
> 
> works wonderfully for a low Reynolds number of 2.2. The solver performance 
> crushes LU as I scale up the problem. However, not surprisingly this options 
> table struggles when I bump the Reynolds number to 220. I've read that use of 
> AIR (approximate ideal restriction) can improve performance for advection 
> dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 
> for a simple diffusion problem and the option works fine. However, when 
> applying it to my field-split preconditioned Navier-Stokes system, I get 
> immediate non-convergence:
> 
>  0 Nonlinear |R| = 1.033077e+03
>   0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
> 
> Does anyone have an idea as to why this might be happening?

Do not use this option, even when not part of PCFIELDSPLIT.
There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
comment here 
https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
stabilized convection-diffusion problem near the pure convection limit 
(something that ℓAIR is supposed to handle).
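
(For concreteness, a model problem of that kind, not necessarily the exact case
Pierre ran, is the singularly perturbed scalar equation

\[ -\varepsilon\,\Delta u + \boldsymbol{\beta}\cdot\nabla u = f \quad \text{in } \Omega,
   \qquad 0 < \varepsilon \ll \|\boldsymbol{\beta}\|\,h, \]

with h the mesh size, discretized with a consistent stabilization such as SUPG;
the discrete operator becomes strongly nonsymmetric as convection dominates.)
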
Even worse, you can make HYPRE fill your terminal with printf-style debugging 
messages 
https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
 with this option turned on.
As a result, I have been unable to reproduce any of the ℓAIR results.
This also explains why I have been using plain BoomerAMG instead of ℓAIR for 
the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you would 
like to try the PC we are using, I could send you the command line options).

Thanks,
Pierre

> If not, I'd take a suggestion on where to set a breakpoint to start my own 
> investigation. Alternatively, I welcome other preconditioning suggestions for 
> an advection dominated problem.
> 
> Alex



Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Alexander Lindsay
OpenMP is definitely linked in and appears in the stacktrace but I haven’t asked
for any threads (to my knowledge).

> On Apr 13, 2023, at 7:03 PM, Mark Adams  wrote:
>
> Are you using OpenMP? ("OMP").
> If so try without it.
>
> On Thu, Apr 13, 2023 at 5:07 PM Alexander Lindsay  wrote:
>
>> Here's the result.
>>
>> [...]

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Mark Adams
Are you using OpenMP? ("OMP").
If so try without it.
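
(A quick way to test this without rebuilding anything, assuming hypre honors the
standard OpenMP environment variables, is to pin the run to a single thread; the
executable and input names below are placeholders:

  OMP_NUM_THREADS=1 mpiexec -n 4 ./moose-app-opt -i input.i

If the NaN disappears with a single thread, that would point toward the threaded
hypre_ParMatmul code path visible in the backtrace quoted below.)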

On Thu, Apr 13, 2023 at 5:07 PM Alexander Lindsay 
wrote:

> Here's the result.
>
> 0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm
> 1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00
>   Residual norms for fieldsplit_u_ solve.
>   0 KSP Residual norm   -nan
>   Residual norms for fieldsplit_p_ solve.
>   0 KSP Residual norm   -nan
>   Residual norms for fieldsplit_u_ solve.
>   0 KSP Residual norm   -nan
>   1 KSP Residual norm   -nan
>   Residual norms for fieldsplit_u_ solve.
>   0 KSP Residual norm   -nan
>   Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>  PC failed due to SUBPC_ERROR
>
> I probably should have read the FAQ on `-fp_trap` before sending my first
> email.
>
> Working with this stack trace
>
>  (gdb) bt
> #0  0x7fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at
> par_csr_matop.c:1124
> #1  0x74982a16 in GOMP_parallel () from
> /lib/x86_64-linux-gnu/libgomp.so.1
> #2  0x7fffe83abfd1 in hypre_ParMatmul (A=, 
> B=B@entry=0x5da2ffa0)
> at par_csr_matop.c:967
> #3  0x7fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=,
> A=, f=,
> u=) at par_amg_setup.c:2790
> #4  0x7fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=,
> A=, b=,
> x=) at HYPRE_parcsr_amg.c:47
> #5  0x7fffe940d33c in PCSetUp_HYPRE (pc=)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418
> #6  0x7fffe9413d87 in PCSetUp (pc=0x5d5ef390)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017
> #7  0x7fffe94f856b in KSPSetUp (ksp=ksp@entry=0x5d5eecb0)
> at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408
> #8  0x7fffe94fa6f4 in KSPSolve_Private (ksp=ksp@entry=0x5d5eecb0,
> b=0x5d619730, x=)
> at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852
> #9  0x7fffe94fd8b1 in KSPSolve (ksp=ksp@entry=0x5d5eecb0,
> b=, x=)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
> #10 0x7fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x55bef790,
> x=0x56d5a510, y=0x56d59e30)
> at
> /home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185
> #11 0x7fffe9414484 in PCApply (pc=pc@entry=0x55bef790, 
> x=x@entry=0x56d5a510,
> y=y@entry=0x56d59e30)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445
> #12 0x7fffe9415ad7 in PCApplyBAorAB (pc=0x55bef790, side=PC_RIGHT,
> x=0x56d5a510,
> y=y@entry=0x56e922a0, work=0x56d59e30)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727
> #13 0x7fffe9451fcd in KSP_PCApplyBAorAB (w=,
> y=0x56e922a0, x=,
> ksp=0x56068fc0) at
> /home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421
> #14 KSPGMRESCycle (itcount=itcount@entry=0x7fffcca0, ksp=ksp@entry
> =0x56068fc0)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162
> #15 0x7fffe94536f9 in KSPSolve_GMRES (ksp=0x56068fc0)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247
> #16 0x7fffe94fb1c4 in KSPSolve_Private (ksp=0x56068fc0, 
> b=b@entry=0x5568e510,
> x=,
> x@entry=0x5607cce0) at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914
> #17 0x7fffe94fd8b1 in KSPSolve (ksp=, 
> b=b@entry=0x5568e510,
> x=x@entry=0x5607cce0)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
> #18 0x7fffe9582850 in SNESSolve_NEWTONLS (snes=0x56065610)
> at /home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225
> #19 0x7fffe959c7ee in SNESSolve (snes=0x56065610, b=0x0,
> x=)
> at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809
>
> On Thu, Apr 13, 2023 at 1:54 PM Barry Smith  wrote:
>
>>
>>   It would be useful to see the convergences inside the linear solve so
>> perhaps start with
>>
>> -ksp_monitor_true_residual
>>
>> -fieldsplit_u_ksp_type richardson (this is to allow the monitor below
>> to work)
>> -fieldsplit_u_ksp_max_its 1
>> -fieldsplit_u_ksp_monitor
>>
>> Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff
>> than me.
>>
>> We should have a convenience option like -pc_fieldsplit_schur_monitor
>> similar to the -pc_fieldsplit_gkb_monitor
>>
>>
>>
>> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay 
>> wrote:
>>
>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
>> numbers. My options table
>>
>> -dm_moose_fieldsplit_names u,p
>> -dm_moose_nfieldsplits 2
>> -fieldsplit_p_dm_moose_vars pressure
>> -fieldsplit_p_ksp_type preonly
>> -fieldsplit_p_pc_type jacobi
>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>> -fieldsplit_u_ksp_type preonly
>> -fieldsplit_u_pc_hypre_type boomeramg
>> -fieldsplit_u_pc_type hypre
>> 

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Matthew Knepley
On Thu, Apr 13, 2023 at 5:07 PM Alexander Lindsay 
wrote:

> Here's the result.
>
> 0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm
> 1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00
>   Residual norms for fieldsplit_u_ solve.
>   0 KSP Residual norm   -nan
>   Residual norms for fieldsplit_p_ solve.
>   0 KSP Residual norm   -nan
>   Residual norms for fieldsplit_u_ solve.
>   0 KSP Residual norm   -nan
>   1 KSP Residual norm   -nan
>   Residual norms for fieldsplit_u_ solve.
>   0 KSP Residual norm   -nan
>   Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>  PC failed due to SUBPC_ERROR
>
> I probably should have read the FAQ on `-fp_trap` before sending my first
> email.
>

I think this can be mailed to Hypre now. We do not interfere in
their SetUp, so I think it has to be on their side.

  Thanks,

 Matt


> Working with this stack trace
>
>  (gdb) bt
> #0  0x7fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at
> par_csr_matop.c:1124
> #1  0x74982a16 in GOMP_parallel () from
> /lib/x86_64-linux-gnu/libgomp.so.1
> #2  0x7fffe83abfd1 in hypre_ParMatmul (A=, 
> B=B@entry=0x5da2ffa0)
> at par_csr_matop.c:967
> #3  0x7fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=,
> A=, f=,
> u=) at par_amg_setup.c:2790
> #4  0x7fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=,
> A=, b=,
> x=) at HYPRE_parcsr_amg.c:47
> #5  0x7fffe940d33c in PCSetUp_HYPRE (pc=)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418
> #6  0x7fffe9413d87 in PCSetUp (pc=0x5d5ef390)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017
> #7  0x7fffe94f856b in KSPSetUp (ksp=ksp@entry=0x5d5eecb0)
> at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408
> #8  0x7fffe94fa6f4 in KSPSolve_Private (ksp=ksp@entry=0x5d5eecb0,
> b=0x5d619730, x=)
> at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852
> #9  0x7fffe94fd8b1 in KSPSolve (ksp=ksp@entry=0x5d5eecb0,
> b=, x=)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
> #10 0x7fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x55bef790,
> x=0x56d5a510, y=0x56d59e30)
> at
> /home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185
> #11 0x7fffe9414484 in PCApply (pc=pc@entry=0x55bef790, 
> x=x@entry=0x56d5a510,
> y=y@entry=0x56d59e30)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445
> #12 0x7fffe9415ad7 in PCApplyBAorAB (pc=0x55bef790, side=PC_RIGHT,
> x=0x56d5a510,
> y=y@entry=0x56e922a0, work=0x56d59e30)
> at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727
> #13 0x7fffe9451fcd in KSP_PCApplyBAorAB (w=,
> y=0x56e922a0, x=,
> ksp=0x56068fc0) at
> /home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421
> #14 KSPGMRESCycle (itcount=itcount@entry=0x7fffcca0, ksp=ksp@entry
> =0x56068fc0)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162
> #15 0x7fffe94536f9 in KSPSolve_GMRES (ksp=0x56068fc0)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247
> #16 0x7fffe94fb1c4 in KSPSolve_Private (ksp=0x56068fc0, 
> b=b@entry=0x5568e510,
> x=,
> x@entry=0x5607cce0) at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914
> #17 0x7fffe94fd8b1 in KSPSolve (ksp=, 
> b=b@entry=0x5568e510,
> x=x@entry=0x5607cce0)
> at
> /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
> #18 0x7fffe9582850 in SNESSolve_NEWTONLS (snes=0x56065610)
> at /home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225
> #19 0x7fffe959c7ee in SNESSolve (snes=0x56065610, b=0x0,
> x=)
> at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809
>
> On Thu, Apr 13, 2023 at 1:54 PM Barry Smith  wrote:
>
>>
>>   It would be useful to see the convergences inside the linear solve so
>> perhaps start with
>>
>> -ksp_monitor_true_residual
>>
>> -fieldsplit_u_ksp_type richardson (this is to allow the monitor below
>> to work)
>> -fieldsplit_u_ksp_max_its 1
>> -fieldsplit_u_ksp_monitor
>>
>> Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff
>> than me.
>>
>> We should have a convenience option like -pc_fieldsplit_schur_monitor
>> similar to the -pc_fieldsplit_gkb_monitor
>>
>>
>>
>> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay 
>> wrote:
>>
>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
>> numbers. My options table
>>
>> -dm_moose_fieldsplit_names u,p
>> -dm_moose_nfieldsplits 2
>> -fieldsplit_p_dm_moose_vars pressure
>> -fieldsplit_p_ksp_type preonly
>> -fieldsplit_p_pc_type jacobi
>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>> 

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Alexander Lindsay
Here's the result.

0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm
1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00
  Residual norms for fieldsplit_u_ solve.
  0 KSP Residual norm   -nan
  Residual norms for fieldsplit_p_ solve.
  0 KSP Residual norm   -nan
  Residual norms for fieldsplit_u_ solve.
  0 KSP Residual norm   -nan
  1 KSP Residual norm   -nan
  Residual norms for fieldsplit_u_ solve.
  0 KSP Residual norm   -nan
  Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
 PC failed due to SUBPC_ERROR

I probably should have read the FAQ on `-fp_trap` before sending my first
email.
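
(For anyone following along: `-fp_trap` is the PETSc runtime option that makes
floating-point exceptions trap, so the first NaN/Inf can be caught in a
debugger. A sketch of that kind of run, with placeholder executable and input
names:

  gdb --args ./moose-app-opt -i input.i -fp_trap
  (gdb) run
  # gdb stops on the SIGFPE raised at the offending operation, then:
  (gdb) bt

which is presumably how the backtrace below was obtained.)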

Working with this stack trace

 (gdb) bt
#0  0x7fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at
par_csr_matop.c:1124
#1  0x74982a16 in GOMP_parallel () from
/lib/x86_64-linux-gnu/libgomp.so.1
#2  0x7fffe83abfd1 in hypre_ParMatmul (A=,
B=B@entry=0x5da2ffa0)
at par_csr_matop.c:967
#3  0x7fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=,
A=, f=,
u=) at par_amg_setup.c:2790
#4  0x7fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=,
A=, b=,
x=) at HYPRE_parcsr_amg.c:47
#5  0x7fffe940d33c in PCSetUp_HYPRE (pc=)
at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418
#6  0x7fffe9413d87 in PCSetUp (pc=0x5d5ef390)
at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017
#7  0x7fffe94f856b in KSPSetUp (ksp=ksp@entry=0x5d5eecb0)
at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408
#8  0x7fffe94fa6f4 in KSPSolve_Private (ksp=ksp@entry=0x5d5eecb0,
b=0x5d619730, x=)
at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852
#9  0x7fffe94fd8b1 in KSPSolve (ksp=ksp@entry=0x5d5eecb0,
b=, x=)
at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
#10 0x7fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x55bef790,
x=0x56d5a510, y=0x56d59e30)
at
/home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185
#11 0x7fffe9414484 in PCApply (pc=pc@entry=0x55bef790,
x=x@entry=0x56d5a510,
y=y@entry=0x56d59e30)
at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445
#12 0x7fffe9415ad7 in PCApplyBAorAB (pc=0x55bef790, side=PC_RIGHT,
x=0x56d5a510,
y=y@entry=0x56e922a0, work=0x56d59e30)
at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727
#13 0x7fffe9451fcd in KSP_PCApplyBAorAB (w=,
y=0x56e922a0, x=,
ksp=0x56068fc0) at
/home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421
#14 KSPGMRESCycle (itcount=itcount@entry=0x7fffcca0, ksp=ksp@entry
=0x56068fc0)
at /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162
#15 0x7fffe94536f9 in KSPSolve_GMRES (ksp=0x56068fc0)
at /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247
#16 0x7fffe94fb1c4 in KSPSolve_Private (ksp=0x56068fc0,
b=b@entry=0x5568e510,
x=,
x@entry=0x5607cce0) at
/home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914
#17 0x7fffe94fd8b1 in KSPSolve (ksp=,
b=b@entry=0x5568e510,
x=x@entry=0x5607cce0)
at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
#18 0x7fffe9582850 in SNESSolve_NEWTONLS (snes=0x56065610)
at /home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225
#19 0x7fffe959c7ee in SNESSolve (snes=0x56065610, b=0x0,
x=)
at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809

On Thu, Apr 13, 2023 at 1:54 PM Barry Smith  wrote:

>
>   It would be useful to see the convergences inside the linear solve so
> perhaps start with
>
> -ksp_monitor_true_residual
>
> -fieldsplit_u_ksp_type richardson (this is to allow the monitor below
> to work)
> -fieldsplit_u_ksp_max_its 1
> -fieldsplit_u_ksp_monitor
>
> Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff
> than me.
>
> We should have a convenience option like -pc_fieldsplit_schur_monitor
> similar to the -pc_fieldsplit_gkb_monitor
>
>
>
> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay 
> wrote:
>
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
> numbers. My options table
>
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
>
> works wonderfully for a low Reynolds number of 2.2. The solver performance
> crushes LU as I scale up the problem. However, not surprisingly this
> options table struggles when I bump the 

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Barry Smith

  It would be useful to see the convergences inside the linear solve so perhaps 
start with 

-ksp_monitor_true_residual 

-fieldsplit_u_ksp_type richardson (this is to allow the monitor below to 
work)
-fieldsplit_u_ksp_max_its 1 
-fieldsplit_u_ksp_monitor

Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff than me.

We should have a convenience option like -pc_fieldsplit_schur_monitor similar 
to the -pc_fieldsplit_gkb_monitor



> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay  
> wrote:
> 
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. 
> My options table
> 
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
> 
> works wonderfully for a low Reynolds number of 2.2. The solver performance 
> crushes LU as I scale up the problem. However, not surprisingly this options 
> table struggles when I bump the Reynolds number to 220. I've read that use of 
> AIR (approximate ideal restriction) can improve performance for advection 
> dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 
> for a simple diffusion problem and the option works fine. However, when 
> applying it to my field-split preconditioned Navier-Stokes system, I get 
> immediate non-convergence:
> 
>  0 Nonlinear |R| = 1.033077e+03
>   0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
> 
> Does anyone have an idea as to why this might be happening? If not, I'd take 
> a suggestion on where to set a breakpoint to start my own investigation. 
> Alternatively, I welcome other preconditioning suggestions for an advection 
> dominated problem.
> 
> Alex



[petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Alexander Lindsay
Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
numbers. My options table

-dm_moose_fieldsplit_names u,p
-dm_moose_nfieldsplits 2
-fieldsplit_p_dm_moose_vars pressure
-fieldsplit_p_ksp_type preonly
-fieldsplit_p_pc_type jacobi
-fieldsplit_u_dm_moose_vars vel_x,vel_y
-fieldsplit_u_ksp_type preonly
-fieldsplit_u_pc_hypre_type boomeramg
-fieldsplit_u_pc_type hypre
-pc_fieldsplit_schur_fact_type full
-pc_fieldsplit_schur_precondition selfp
-pc_fieldsplit_type schur
-pc_type fieldsplit

works wonderfully for a low Reynolds number of 2.2. The solver performance
crushes LU as I scale up the problem. However, not surprisingly this
options table struggles when I bump the Reynolds number to 220. I've read
that use of AIR (approximate ideal restriction) can improve performance for
advection dominated problems. I've tried
setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion
problem and the option works fine. However, when applying it to my
field-split preconditioned Navier-Stokes system, I get immediate
non-convergence:

 0 Nonlinear |R| = 1.033077e+03
  0 Linear |R| = 1.033077e+03
  Linear solve did not converge due to DIVERGED_NANORINF iterations 0
Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
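
(For context on what the options table above asks PETSc to build, a sketch in
standard 2x2 block notation, with the splits ordered u,p so that A is the
velocity block: the "full" Schur factorization applied by PCFIELDSPLIT is

\[
\begin{pmatrix} A & B \\ C & D \end{pmatrix}^{-1} =
\begin{pmatrix} I & -A^{-1}B \\ 0 & I \end{pmatrix}
\begin{pmatrix} A^{-1} & 0 \\ 0 & S^{-1} \end{pmatrix}
\begin{pmatrix} I & 0 \\ -C A^{-1} & I \end{pmatrix},
\qquad S = D - C A^{-1} B,
\]

and `selfp` means the preconditioner for S is assembled from the approximation

\[ S_p = D - C\,\mathrm{diag}(A)^{-1} B . \]

The inner applications of A^{-1} are handled by the fieldsplit_u
hypre/BoomerAMG options and S_p by the fieldsplit_p Jacobi options.)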

Does anyone have an idea as to why this might be happening? If not, I'd
take a suggestion on where to set a breakpoint to start my own
investigation. Alternatively, I welcome other preconditioning suggestions
for an advection dominated problem.
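
(On the breakpoint question: given that the linear solve reports
DIVERGED_NANORINF, one hypothetical starting point is to break where the
velocity AMG hierarchy is built, e.g.

  gdb --args ./moose-app-opt -i input.i
  (gdb) break PCSetUp_HYPRE
  (gdb) break hypre_BoomerAMGSetup
  (gdb) run

with placeholder executable/input names, or to run with -fp_trap so that the
first floating-point exception traps directly at the offending operation.)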

Alex