Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-27 Thread Mark Adams
I would not worry about the null space (if you have elasticity or the
equivalent, use hypre for now), and strength of connections is not very
useful in my experience (it is confounded by high order, and no one has
bothered to deploy a fancy strength-of-connection method in a library
that I know of). If you have anisotropies or material discontinuities,
honestly, AMG does not do as well as advertised. That said, we could
talk after you get up and running.
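
For the hypre route, a starting point might be the following (a sketch;
the threshold value is a common rule of thumb for 3D elasticity, not a
tuned setting):

  -pc_type hypre -pc_hypre_type boomeramg \
  -pc_hypre_boomeramg_strong_threshold 0.7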

If your problems are very hard, then, as Matt said, old-fashioned
geometric MG using modern unstructured (FE) discretizations and mesh
management is something to consider. PETSc has support for this, and we
are actively using and developing that support. Antony Jameson has been
doing this for decades; here is an example of a new project doing
something like this: https://arxiv.org/abs/2307.04528. Tobin Isaac (in
PETSc) and many others have done things like this, but these solvers
tend to be customized for an application, whereas AMG strives to be
general.

Mark


Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Matthew Knepley
On Thu, Jul 27, 2023 at 12:48 AM Jed Brown wrote:

> AMG is subtle here. With AMG for systems, you typically feed it elements
> of the near null space. In the case of (smoothed) aggregation, the coarse
> space will have a regular block structure with block sizes equal to the
> number of near-null vectors. You can use pc_fieldsplit options to select
> which fields you want in each split.
>
> However, AMG also needs a strength of connection, and if your system
> is so weird that you need to fieldsplit the smoothers (e.g., a saddle
> point problem or a hyperbolic system), then it's likely that you'll
> also need a custom strength of connection to obtain reasonable
> coarsening.
>

For this reason, sometimes GMG is easier for systems since you just
rediscretize.
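
For instance, with a DMDA-based code where the operator is provided
through the DM (e.g., via KSPSetComputeOperators) so PETSc can
rediscretize on each coarser grid, a sketch would be:

  ./your_app -da_refine 4 -pc_type mg

("./your_app" is a placeholder.) The fieldsplit smoother options
discussed in this thread apply to that geometric hierarchy as well.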

  Thanks,

 Matt


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Jed Brown
AMG is subtle here. With AMG for systems, you typically feed it elements of the 
near null space. In the case of (smoothed) aggregation, the coarse space will 
have a regular block structure with block sizes equal to the number of 
near-null vectors. You can use pc_fieldsplit options to select which fields you 
want in each split.
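
For concreteness, a minimal sketch of attaching the near-null space for
3D elasticity, assuming A is your assembled operator and coords is a
vector of nodal coordinates blocked like the solution:

#include <petscmat.h>

/* Attach the rigid-body modes (six in 3D) as the near-null space.
   GAMG reads this off the matrix when building its coarse spaces. */
static PetscErrorCode AttachElasticityNearNullSpace(Mat A, Vec coords)
{
  MatNullSpace nearnull;

  PetscFunctionBeginUser;
  PetscCall(MatNullSpaceCreateRigidBody(coords, &nearnull));
  PetscCall(MatSetNearNullSpace(A, nearnull));
  PetscCall(MatNullSpaceDestroy(&nearnull));
  PetscFunctionReturn(PETSC_SUCCESS);
}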

However, AMG also needs a strength of connection, and if your system is
so weird that you need to fieldsplit the smoothers (e.g., a saddle point
problem or a hyperbolic system), then it's likely that you'll also need
a custom strength of connection to obtain reasonable coarsening.
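
With GAMG, the scalar strength threshold can at least be adjusted from
the command line (the value here is purely illustrative):

  -pc_type gamg -pc_gamg_threshold 0.05

A genuinely custom strength measure, though, is not a command-line knob.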


Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Barry Smith

  See the very end of the section
https://petsc.org/release/manual/ksp/#multigrid-preconditioners on how
to control the smoothers (and coarse grid solve) for multigrid in
PETSc, including for algebraic multigrid.

   So, for example, -mg_levels_pc_type fieldsplit would be the starting
point. Depending on the block size of the matrices, it may automatically
do simple splits; you can control the details of the fieldsplit
preconditioner with -mg_levels_pc_fieldsplit_... and the details for
each split with -mg_levels_fieldsplit_...

   See src/ksp/ksp/tutorials/ex42.c for example usage.
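
   Putting it together, a hypothetical invocation for a two-field
problem might look like this (a sketch; the block size, smoothers, and
outer Krylov method are assumptions, not a recipe):

  -ksp_type fgmres -pc_type gamg \
  -mg_levels_ksp_type richardson \
  -mg_levels_pc_type fieldsplit \
  -mg_levels_pc_fieldsplit_block_size 2 \
  -mg_levels_fieldsplit_0_pc_type sor \
  -mg_levels_fieldsplit_1_pc_type jacobi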

   Feel free to ask more specific questions once you get started.

> On Jul 26, 2023, at 9:47 PM, Michael Wick wrote:
> 
> Hello PETSc team:
> 
> I wonder if the current PETSc implementation supports using AMG 
> monolithically for a multi-field problem and using fieldsplit in the smoother.
> 
> Thank you very much,
> 
> Mike