Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-16 Thread Pierre Jolivet

> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay  
> wrote:
> 
> Are there any plans to get the missing hook into PETSc for AIR? Just curious 
> if there’s an issue I can subscribe to or anything.

Not that I know of, but it would make for a nice contribution if you feel like 
creating a PR.

Thanks,
Pierre 

> (Independently I’m excited to test HPDDM out tomorrow)
> 
>> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet  wrote:
>> 
>> 
>>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay  
>>> wrote:
>>> 
>>> Pierre,
>>> 
>>> This is very helpful information. Thank you. Yes I would appreciate those 
>>> command line options if you’re willing to share!
>> 
>> No problem, I’ll get in touch with you in private first, because it may 
>> require some extra work (need a couple of extra options in PETSc 
>> ./configure), and this is not very related to the problem at hand, so best 
>> not to spam the mailing list.
>> 
>> Thanks,
>> Pierre
>> 
 On Apr 13, 2023, at 9:54 PM, Pierre Jolivet  wrote:
 
 
 
> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
> wrote:
> 
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds 
> numbers. My options table
> 
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
> 
> works wonderfully for a low Reynolds number of 2.2. The solver 
> performance crushes LU as I scale up the problem. However, not 
> surprisingly, this options table struggles when I bump the Reynolds 
> number to 220. I've read that use of AIR (approximate ideal restriction) 
> can improve performance for advection-dominated problems. I've tried 
> setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion 
> problem and the option works fine. However, when applying it to my 
> field-split preconditioned Navier-Stokes system, I get immediate 
> non-convergence:
> 
>  0 Nonlinear |R| = 1.033077e+03
>   0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
> 
> Does anyone have an idea as to why this might be happening?
 
 Do not use this option, even when not part of PCFIELDSPLIT.
 There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
 comment here 
 https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
 In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
 stabilized convection-diffusion problem near the pure convection limit 
 (something that ℓAIR is supposed to handle).
 Even worse, with this option turned on, you can make HYPRE fill your 
 terminal with printf-style debugging messages, see 
 https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
 As a result, I have been unable to reproduce any of the ℓAIR results.
 This also explains why I have been using plain BoomerAMG instead of ℓAIR 
 for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if 
 you would like to try the PC we are using, I could send you the command 
 line options).
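
 For anyone who wants to experiment with this, a minimal sketch of such a test 
 (a 1D upwinded, i.e. crudely stabilized, convection-diffusion operator solved 
 through KSP so that the restriction type can be toggled from the command line) 
 is given below. It assumes a PETSc build configured with hypre, it is not the 
 exact setup referenced above, and whether it actually produces NaN will depend 
 on the hypre version and parameters.

#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscInt  n = 1000, i, Istart, Iend;
  PetscReal eps = 1e-6, h;
  Mat       A;
  Vec       x, b;
  KSP       ksp;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(PetscOptionsGetInt(NULL, NULL, "-n", &n, NULL));
  PetscCall(PetscOptionsGetReal(NULL, NULL, "-eps", &eps, NULL));
  h = 1.0 / (n + 1);

  /* -eps u'' + u' = 1 on (0,1), homogeneous Dirichlet BCs, first-order upwinding */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
  for (i = Istart; i < Iend; i++) {
    PetscReal diag = 2.0 * eps / (h * h) + 1.0 / h; /* diffusion + upwinded convection */
    PetscReal low  = -eps / (h * h) - 1.0 / h;
    PetscReal up   = -eps / (h * h);
    if (i > 0) PetscCall(MatSetValue(A, i, i - 1, low, INSERT_VALUES));
    PetscCall(MatSetValue(A, i, i, diag, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, up, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}

 Run with, e.g., -eps 1e-8 -ksp_monitor -ksp_converged_reason -pc_type hypre 
 -pc_hypre_type boomeramg -pc_hypre_boomeramg_restriction_type 1, and compare 
 against -pc_hypre_boomeramg_restriction_type 0.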
 
 Thanks,
 Pierre
 
> If not, I'd take a suggestion on where to set a breakpoint to start my 
> own investigation. Alternatively, I welcome other preconditioning 
> suggestions for an advection-dominated problem.
> 
> Alex
 
>> 



Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-16 Thread Alexander Lindsay
Are there any plans to get the missing hook into PETSc for AIR? Just curious if 
there’s an issue I can subscribe to or anything.

(Independently I’m excited to test HPDDM out tomorrow)

> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet  wrote:
> 
> 
>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay  
>> wrote:
>> 
>> Pierre,
>> 
>> This is very helpful information. Thank you. Yes I would appreciate those 
>> command line options if you’re willing to share!
> 
> No problem, I’ll get in touch with you in private first, because it may 
> require some extra work (need a couple of extra options in PETSc 
> ./configure), and this is not very related to the problem at hand, so best 
> not to spam the mailing list.
> 
> Thanks,
> Pierre
> 
 On Apr 13, 2023, at 9:54 PM, Pierre Jolivet  wrote:
 
>>> 
>>> 
 On 13 Apr 2023, at 10:33 PM, Alexander Lindsay  
 wrote:
 
 Hi, I'm trying to solve steady Navier-Stokes for different Reynolds 
 numbers. My options table
 
 -dm_moose_fieldsplit_names u,p
 -dm_moose_nfieldsplits 2
 -fieldsplit_p_dm_moose_vars pressure
 -fieldsplit_p_ksp_type preonly
 -fieldsplit_p_pc_type jacobi
 -fieldsplit_u_dm_moose_vars vel_x,vel_y
 -fieldsplit_u_ksp_type preonly
 -fieldsplit_u_pc_hypre_type boomeramg
 -fieldsplit_u_pc_type hypre
 -pc_fieldsplit_schur_fact_type full
 -pc_fieldsplit_schur_precondition selfp
 -pc_fieldsplit_type schur
 -pc_type fieldsplit
 
 works wonderfully for a low Reynolds number of 2.2. The solver performance 
 crushes LU as I scale up the problem. However, not surprisingly, this 
 options table struggles when I bump the Reynolds number to 220. I've read 
 that use of AIR (approximate ideal restriction) can improve performance 
 for advection-dominated problems. I've tried setting 
 -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and 
 the option works fine. However, when applying it to my field-split 
 preconditioned Navier-Stokes system, I get immediate non-convergence:
 
  0 Nonlinear |R| = 1.033077e+03
   0 Linear |R| = 1.033077e+03
   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
 Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
 
 Does anyone have an idea as to why this might be happening?
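
(A note on option prefixes: since the BoomerAMG preconditioner above lives inside 
the velocity split, the restriction type would presumably have to be passed 
through the split prefix, i.e. something along the lines of

  -fieldsplit_u_pc_hypre_boomeramg_restriction_type 1

rather than the bare top-level spelling; the exact form used is not recorded in 
this thread.)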
>>> 
>>> Do not use this option, even when not part of PCFIELDSPLIT.
>>> There is some missing plumbing in PETSc which makes it unusable, see Ben’s 
>>> comment here 
>>> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>>> In fact, it’s quite easy to make HYPRE generate NaN with a very simple 
>>> stabilized convection-diffusion problem near the pure convection limit 
>>> (something that ℓAIR is supposed to handle).
>>> Even worse, with this option turned on, you can make HYPRE fill your 
>>> terminal with printf-style debugging messages, see 
>>> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>>> As a result, I have been unable to reproduce any of the ℓAIR results.
>>> This also explains why I have been using plain BoomerAMG instead of ℓAIR 
>>> for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if 
>>> you would like to try the PC we are using, I could send you the command 
>>> line options).
>>> 
>>> Thanks,
>>> Pierre
>>> 
 If not, I'd take a suggestion on where to set a breakpoint to start my own 
 investigation. Alternatively, I welcome other preconditioning suggestions 
 for an advection-dominated problem.
 
 Alex
>>> 
> 


Re: [petsc-users] Fieldsplit with redistribute

2023-04-16 Thread Carl-Johan Thore via petsc-users
Ok I see, thanks! I'll try PCSetUp_Redistribute() then.

From: Barry Smith 
Sent: 16 April 2023 22:31:14
To: Carl-Johan Thore 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Fieldsplit with redistribute


   The manual page for ISEmbed is incomprehensible to me. Anyway, no matter 
what, you need to know which degrees of freedom are removed by PCREDISTRIBUTE in 
order to produce the reduced IS, which is why I think you need information only 
available inside PCSetUp_Redistribute(). (Sorry, it is PCSetUp_Redistribute(), 
not PCApply_Redistribute().)

  Barry


On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore  wrote:

Thanks for the quick reply, Barry!
I have not tried the version with PCApply_Redistribute() that you suggest, but I 
have a code that does roughly what you describe. It works when running on one 
rank, but fails on multiple ranks. I suspect the issue is with the use of 
ISEmbed as, quoting the PETSc manual, "the resulting IS is sequential, since 
the index substitution it encodes is purely local" (admittedly I don't fully 
understand what that means). If you think using ISEmbed is not a good idea, 
I'll try PCApply_Redistribute().

From: Barry Smith 
Sent: 16 April 2023 21:11:18
To: Carl-Johan Thore 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Fieldsplit with redistribute


   There is no code to do this currently.

I would start by building your IS for each split before the PCREDISTRIBUTE 
and then adding code to PCApply_Redistribute() that "fixes" these IS by removing 
the entries associated with the removed degrees of freedom and then shifting the 
remaining indices to account for the removed ones. But you have probably already 
been trying this? It does require digging directly into PCApply_Redistribute() 
to get the needed information (which degrees of freedom are removed by the 
redistribute code), plus it requires shifting the MPI rank ownership of the IS 
entries in the same way the MPI rank ownership of the degrees of freedom of the 
vector is moved.

   If you have some code that you think should be doing this but doesn't work 
feel free to send it to us and we may be able to fix it.

  Barry


> On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users 
>  wrote:
>
> Hello,
> I'm solving a block system
> [A C;
> C' D],
> where D is not zero, using the PCFIELDSPLIT preconditioner with the splits set 
> via PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE 
> (which is attractive as I have many locked DOFs). I suspect something goes 
> wrong when constructing the IS for the splits (I've tried various things using 
> the IS routines). Can PETSc do this automatically? Or else, any hints?
> Kind regards,
> Carl-Johan




Re: [petsc-users] Fieldsplit with redistribute

2023-04-16 Thread Barry Smith

   The manual page for ISEmbed is incomprehensible to me. Anyway, no matter 
what, you need to know which degrees of freedom are removed by PCREDISTRIBUTE in 
order to produce the reduced IS, which is why I think you need information only 
available inside PCSetUp_Redistribute(). (Sorry, it is PCSetUp_Redistribute(), 
not PCApply_Redistribute().)

  Barry


> On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore  wrote:
> 
> Thanks for the quick reply, Barry!
> I have not tried the version with PCApply_Redistribute() that you suggest, but 
> I have a code that does roughly what you describe. It works when running on 
> one rank, but fails on multiple ranks. I suspect the issue is with the use of 
> ISEmbed as, quoting the PETSc manual, "the resulting IS is sequential, since 
> the index substitution it encodes is purely local" (admittedly I don't fully 
> understand what that means). If you think using ISEmbed is not a good idea, 
> I'll try PCApply_Redistribute().
> From: Barry Smith 
> Sent: 16 April 2023 21:11:18
> To: Carl-Johan Thore 
> Cc: petsc-users@mcs.anl.gov 
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>  
> 
>There is no code to do this currently. 
> 
> I would start by building your IS for each split before the PCREDISTRIBUTE 
> and then adding code to PCApply_Redistribute() that "fixes" these IS by 
> removing the entries associated with the removed degrees of freedom and then 
> shifting the remaining indices to account for the removed ones. But you have 
> probably already been trying this? It does require digging directly into 
> PCApply_Redistribute() to get the needed information (which degrees of freedom 
> are removed by the redistribute code), plus it requires shifting the MPI rank 
> ownership of the IS entries in the same way the MPI rank ownership of the 
> degrees of freedom of the vector is moved.
> 
>If you have some code that you think should be doing this but doesn't work 
> feel free to send it to us and we may be able to fix it.
> 
>   Barry
> 
> 
> > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users 
> >  wrote:
> > 
> > Hello,
> > I'm solving a block system
> > [A C;
> > C' D],
> > where D is not zero, using the PCFIELDSPLIT preconditioner with the splits 
> > set via PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE 
> > (which is attractive as I have many locked DOFs). I suspect something goes 
> > wrong when constructing the IS for the splits (I've tried various things 
> > using the IS routines). Can PETSc do this automatically? Or else, any hints?
> > Kind regards,
> > Carl-Johan
> 



Re: [petsc-users] Fieldsplit with redistribute

2023-04-16 Thread Carl-Johan Thore via petsc-users
Thanks for the quick reply, Barry!
I have not tried the version with PCApply_Redistribute() that you suggest, but I 
have a code that does roughly what you describe. It works when running on one 
rank, but fails on multiple ranks. I suspect the issue is with the use of 
ISEmbed as, quoting the PETSc manual, "the resulting IS is sequential, since 
the index substitution it encodes is purely local" (admittedly I don't fully 
understand what that means). If you think using ISEmbed is not a good idea, 
I'll try PCApply_Redistribute().

From: Barry Smith 
Sent: 16 April 2023 21:11:18
To: Carl-Johan Thore 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Fieldsplit with redistribute


   There is no code to do this currently.

I would start by building your IS for each split before the PCREDISTRIBUTE 
and then adding code to PCApply_Redistribute() that "fixes" these IS by removing 
the entries associated with the removed degrees of freedom and then shifting the 
remaining indices to account for the removed ones. But you have probably already 
been trying this? It does require digging directly into PCApply_Redistribute() 
to get the needed information (which degrees of freedom are removed by the 
redistribute code), plus it requires shifting the MPI rank ownership of the IS 
entries in the same way the MPI rank ownership of the degrees of freedom of the 
vector is moved.

   If you have some code that you think should be doing this but doesn't work 
feel free to send it to us and we may be able to fix it.

  Barry


> On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users 
>  wrote:
>
> Hello,
> I'm solving a block system
> [A C;
> C' D],
> where D is not zero, using the PCFIELDSPLIT preconditioner with the splits set 
> via PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE 
> (which is attractive as I have many locked DOFs). I suspect something goes 
> wrong when constructing the IS for the splits (I've tried various things using 
> the IS routines). Can PETSc do this automatically? Or else, any hints?
> Kind regards,
> Carl-Johan



Re: [petsc-users] Fieldsplit with redistribute

2023-04-16 Thread Barry Smith


   There is no code to do this currently. 

I would start by building your IS for each split before the PCREDISTRIBUTE 
and then adding code to PCApply_Redistribute() that "fixes" these IS by removing 
the entries associated with the removed degrees of freedom and then shifting the 
remaining indices to account for the removed ones. But you have probably already 
been trying this? It does require digging directly into PCApply_Redistribute() 
to get the needed information (which degrees of freedom are removed by the 
redistribute code), plus it requires shifting the MPI rank ownership of the IS 
entries in the same way the MPI rank ownership of the degrees of freedom of the 
vector is moved.
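
As a rough sketch of the local part of that index fixing (assuming the removed 
global indices are available, sorted in increasing order, on each process; the 
MPI-rank-ownership migration mentioned above is the part that still needs the 
information held inside PCSetUp_Redistribute()):

/* Given the IS of one split in the original numbering and an IS of removed
   global indices, drop the removed entries and shift the survivors into the
   reduced numbering. Sketch only; not existing PETSc code. */
static PetscErrorCode FixSplitIS(IS split, IS removed, IS *reduced)
{
  const PetscInt *s, *r;
  PetscInt       *idx, ns, nr, i, loc, nbelow, kept = 0;

  PetscFunctionBeginUser;
  PetscCall(ISGetLocalSize(split, &ns));
  PetscCall(ISGetLocalSize(removed, &nr));
  PetscCall(ISGetIndices(split, &s));
  PetscCall(ISGetIndices(removed, &r));   /* assumed sorted increasing */
  PetscCall(PetscMalloc1(ns, &idx));
  for (i = 0; i < ns; i++) {
    PetscCall(PetscFindInt(s[i], nr, r, &loc));
    if (loc >= 0) continue;               /* this dof was removed: drop it */
    nbelow = -(loc + 1);                  /* number of removed indices below s[i] */
    idx[kept++] = s[i] - nbelow;          /* shift into the reduced numbering */
  }
  PetscCall(ISRestoreIndices(split, &s));
  PetscCall(ISRestoreIndices(removed, &r));
  PetscCall(ISCreateGeneral(PetscObjectComm((PetscObject)split), kept, idx, PETSC_OWN_POINTER, reduced));
  PetscFunctionReturn(0);
}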

   If you have some code that you think should be doing this but doesn't work 
feel free to send it to us and we may be able to fix it.

  Barry


> On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users 
>  wrote:
> 
> Hello,
> I'm solving a block system
> [A C;
> C' D],
> where D is not zero, using the PCFIELDSPLIT preconditioner with the splits set 
> via PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE 
> (which is attractive as I have many locked DOFs). I suspect something goes 
> wrong when constructing the IS for the splits (I've tried various things using 
> the IS routines). Can PETSc do this automatically? Or else, any hints?
> Kind regards,
> Carl-Johan



[petsc-users] Fieldsplit with redistribute

2023-04-16 Thread Carl-Johan Thore via petsc-users
Hello,
I'm solving a block system
[A C;
C' D],
where D is not zero, using the PCFIELDSPLIT preconditioner with the splits set 
via PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE 
(which is attractive as I have many locked DOFs). I suspect something goes 
wrong when constructing the IS for the splits (I've tried various things using 
the IS routines). Can PETSc do this automatically? Or else, any hints?
Kind regards,
Carl-Johan
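
For context, here is a sketch of the combination being attempted (placeholder 
objects A, b, x, is_u_reduced, is_p_reduced; it assumes PCRedistributeGetKSP() 
to reach the solver that PCREDISTRIBUTE creates internally, and it glosses over 
exactly where in the setup sequence the inner PC may be configured). The open 
question in this thread is precisely how to build is_u_reduced and is_p_reduced 
in the reduced numbering:

KSP ksp, inner;
PC  pc, innerpc;

PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
PetscCall(KSPSetOperators(ksp, A, A));            /* A is the full block system */
PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCSetType(pc, PCREDISTRIBUTE));

PetscCall(PCRedistributeGetKSP(pc, &inner));      /* solver for the reduced system */
PetscCall(KSPGetPC(inner, &innerpc));
PetscCall(PCSetType(innerpc, PCFIELDSPLIT));
PetscCall(PCFieldSplitSetIS(innerpc, "u", is_u_reduced));  /* reduced numbering */
PetscCall(PCFieldSplitSetIS(innerpc, "p", is_p_reduced));  /* reduced numbering */

PetscCall(KSPSetFromOptions(ksp));
PetscCall(KSPSolve(ksp, b, x));

At the command line the inner solver is reached through the -redistribute_ 
prefix (e.g. -redistribute_pc_type fieldsplit), but IS-based splits still have 
to be supplied in code as above.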