Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Matthew Knepley
On Thu, Jul 27, 2023 at 12:48 AM Jed Brown  wrote:

> AMG is subtle here. With AMG for systems, you typically feed it elements
> of the near null space. In the case of (smoothed) aggregation, the coarse
> space will have a regular block structure with block sizes equal to the
> number of near-null vectors. You can use pc_fieldsplit options to select
> which fields you want in each split.
>
> However, AMG also needs a strength of connection, and if your system is so
> weird that you need to fieldsplit the smoothers (e.g., a saddle point
> problem or a hyperbolic system), then it's likely that you'll also need a
> custom strength of connection to obtain reasonable coarsening.
>

For this reason, GMG is sometimes easier for systems, since you can just
rediscretize the problem on each level.
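
A minimal sketch of the "just rediscretize" approach (assumptions, not from this
thread: a scalar problem on a structured grid; ComputeOperators and ComputeRHS
are hypothetical user callbacks that assemble on whatever DMDA level PCMG hands
them):

  #include <petscksp.h>

  /* Hypothetical user callbacks: assemble the matrix/RHS on the DM attached
     to the KSP for the current multigrid level. */
  extern PetscErrorCode ComputeOperators(KSP, Mat, Mat, void *);
  extern PetscErrorCode ComputeRHS(KSP, Vec, void *);

  PetscErrorCode SolveWithRediscretizedGMG(MPI_Comm comm)
  {
    DM  da;
    KSP ksp;

    PetscFunctionBeginUser;
    /* Grid sizes like 17^3 coarsen cleanly when PCMG calls DMCoarsen() */
    PetscCall(DMDACreate3d(comm, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           DMDA_STENCIL_STAR, 17, 17, 17, PETSC_DECIDE, PETSC_DECIDE,
                           PETSC_DECIDE, 1, 1, NULL, NULL, NULL, &da));
    PetscCall(DMSetFromOptions(da));
    PetscCall(DMSetUp(da));
    PetscCall(KSPCreate(comm, &ksp));
    PetscCall(KSPSetDM(ksp, da));
    PetscCall(KSPSetComputeOperators(ksp, ComputeOperators, NULL));
    PetscCall(KSPSetComputeRHS(ksp, ComputeRHS, NULL));
    /* Run with, e.g., -pc_type mg -pc_mg_levels 3: each level is rediscretized
       by calling the same callback on that level's coarsened DMDA */
    PetscCall(KSPSetFromOptions(ksp));
    PetscCall(KSPSolve(ksp, NULL, NULL));
    PetscCall(KSPDestroy(&ksp));
    PetscCall(DMDestroy(&da));
    PetscFunctionReturn(PETSC_SUCCESS);
  }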

  Thanks,

 Matt


> Barry Smith  writes:
>
> >   See the very end of the section
> https://petsc.org/release/manual/ksp/#multigrid-preconditioners on how to
> control the smoothers (and coarse grid solve) for multigrid in PETSc,
> including for algebraic multigrid.
> >
> >So, for example, -mg_levels_pc_type fieldsplit would be the starting
> point. Depending on the block size of the matrices, it may automatically do
> simple splits. You can control the details of the fieldsplit
> preconditioner with -mg_levels_pc_fieldsplit_... and the details for each
> split with -mg_levels_fieldsplit_...
> >
> >See src/ksp/ksp/tutorials/ex42.c for example usage.
> >
> >Feel free to ask more specific questions once you get started.
> >
> >> On Jul 26, 2023, at 9:47 PM, Michael Wick wrote:
> >>
> >> Hello PETSc team:
> >>
> >> I wonder if the current PETSc implementation supports using AMG
> monolithically for a multi-field problem and using fieldsplit in the
> smoother.
> >>
> >> Thank you very much,
> >>
> >> Mike
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Jed Brown
AMG is subtle here. With AMG for systems, you typically feed it elements of the 
near null space. In the case of (smoothed) aggregation, the coarse space will 
have a regular block structure with block sizes equal to the number of 
near-null vectors. You can use pc_fieldsplit options to select which fields you 
want in each split.

However, AMG also needs a strength of connection, and if your system is so weird
that you need to fieldsplit the smoothers (e.g., a saddle point problem or a
hyperbolic system), then it's likely that you'll also need a custom strength of
connection to obtain reasonable coarsening.
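
A minimal sketch of feeding a near null space to GAMG, assuming (these are
assumptions, not part of this thread) a 3D elasticity-like problem with 3 dofs
per node, the block size already set when A was preallocated, and a coordinate
Vec assembled elsewhere:

  #include <petscmat.h>

  /* coords: nodal coordinates with block size 3, same row layout as A;
     A's block size of 3 is assumed to have been set at preallocation time */
  PetscErrorCode AttachNearNullSpace(Mat A, Vec coords)
  {
    MatNullSpace nearnull;

    PetscFunctionBeginUser;
    PetscCall(MatNullSpaceCreateRigidBody(coords, &nearnull)); /* 6 rigid body modes */
    PetscCall(MatSetNearNullSpace(A, nearnull));   /* GAMG builds its coarse space from these */
    PetscCall(MatNullSpaceDestroy(&nearnull));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

With 6 near-null vectors, the aggregation coarse space then has block size 6,
which is the regular block structure the pc_fieldsplit options can split.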

Barry Smith  writes:

>   See the very end of the section 
> https://petsc.org/release/manual/ksp/#multigrid-preconditioners on how to 
> control the smoothers (and coarse grid solve) for multigrid in PETSc, 
> including for algebraic multigrid.
>
>So, for example, -mg_levels_pc_type fieldsplit would be the starting 
> point. Depending on the block size of the matrices, it may automatically do 
> simple splits. You can control the details of the fieldsplit preconditioner 
> with -mg_levels_pc_fieldsplit_... and the details for each split with 
> -mg_levels_fieldsplit_...
>
>See src/ksp/ksp/tutorials/ex42.c for example usage.
>
>Feel free to ask more specific questions once you get started.
>
>> On Jul 26, 2023, at 9:47 PM, Michael Wick wrote:
>> 
>> Hello PETSc team:
>> 
>> I wonder if the current PETSc implementation supports using AMG 
>> monolithically for a multi-field problem and using fieldsplit in the 
>> smoother.
>> 
>> Thank you very much,
>> 
>> Mike


Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Barry Smith

  See the very end of the section 
https://petsc.org/release/manual/ksp/#multigrid-preconditioners on how to 
control the smoothers (and coarse grid solve) for multigrid in PETSc, including 
for algebraic multigrid.

   So, for example, -mg_levels_pc_type fieldsplit would be the starting point. 
Depending on the block size of the matrices, it may automatically do simple 
splits. You can control the details of the fieldsplit preconditioner with 
-mg_levels_pc_fieldsplit_... and the details for each split with 
-mg_levels_fieldsplit_...

   See src/ksp/ksp/tutorials/ex42.c for example usage.

   Feel free to ask more specific questions once you get started.
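
As a minimal sketch of composing these options (the values are illustrative
assumptions: a 2-field problem interlaced with block size 2, SOR on each split;
this is not a recommendation for any particular problem), the whole solver can
be driven from the options database:

  #include <petscksp.h>

  /* A, b, x assumed assembled elsewhere; the interlaced block size is hypothetical */
  PetscErrorCode SolveMonolithicAMGWithFieldsplitSmoother(Mat A, Vec b, Vec x)
  {
    KSP ksp;

    PetscFunctionBeginUser;
    PetscCall(PetscOptionsInsertString(NULL,
      "-pc_type gamg "
      "-mg_levels_ksp_type richardson "
      "-mg_levels_pc_type fieldsplit "
      "-mg_levels_pc_fieldsplit_block_size 2 "
      "-mg_levels_fieldsplit_0_pc_type sor "
      "-mg_levels_fieldsplit_1_pc_type sor"));
    PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), &ksp));
    PetscCall(KSPSetOperators(ksp, A, A));
    PetscCall(KSPSetFromOptions(ksp));  /* picks up the options inserted above */
    PetscCall(KSPSolve(ksp, b, x));
    PetscCall(KSPDestroy(&ksp));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

Whether a simple block-size split still makes sense on the coarser levels
depends on the near null space you give the AMG, as Jed notes above.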

> On Jul 26, 2023, at 9:47 PM, Michael Wick  wrote:
> 
> Hello PETSc team:
> 
> I wonder if the current PETSc implementation supports using AMG 
> monolithically for a multi-field problem and using fieldsplit in the smoother.
> 
> Thank you very much,
> 
> Mike



[petsc-users] Monolithic AMG with fieldsplit as smoother

2023-07-26 Thread Michael Wick
Hello PETSc team:

I wonder if the current PETSc implementation supports using AMG
monolithically for a multi-field problem and using fieldsplit in the
smoother.

Thank you very much,

Mike


Re: [petsc-users] Inverse of a Large Sparse Matrix

2023-07-26 Thread Matthew Knepley
On Wed, Jul 26, 2023 at 8:13 AM maitri ksh  wrote:

> I have a large sparse matrix (48x48) and I need to take its
> inverse and use it to solve an eigenvalue problem. According to
> petsc-archive,
> Barry suggested using SuperLU with MatMatSolve() (and not a KSP solver) for
> matrix sizes of 5000 to 2. I was wondering two things:
> a) is it possible to avoid taking the explicit inverse of the large sparse
> matrix (as was discussed in the archive) for this particular case (in which
> I am using the matrix-inverse for an eigenvalue problem)?
>

You do not actually want to explicitly invert in most instances. SLEPc will
do the right thing automatically.


> b) is a KSP solver more suitable here?
>

Most likely.
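
A minimal sketch of what "the right thing" looks like, under the assumptions
(not from this thread) of a standard eigenproblem Ax = lambda x with eigenvalues
wanted near a target sigma, and PETSc configured with SuperLU: SLEPc's
shift-and-invert ST applies (A - sigma*I)^{-1} through a KSP/PC factorization,
so no explicit inverse is ever formed.

  #include <slepceps.h>

  /* A assumed assembled; sigma and nev are hypothetical inputs */
  PetscErrorCode SolveEigsNearTarget(Mat A, PetscScalar sigma, PetscInt nev)
  {
    EPS eps;
    ST  st;
    KSP ksp;
    PC  pc;

    PetscFunctionBeginUser;
    PetscCall(EPSCreate(PetscObjectComm((PetscObject)A), &eps));
    PetscCall(EPSSetOperators(eps, A, NULL));                /* standard problem */
    PetscCall(EPSSetProblemType(eps, EPS_NHEP));
    PetscCall(EPSSetWhichEigenpairs(eps, EPS_TARGET_MAGNITUDE));
    PetscCall(EPSSetTarget(eps, sigma));
    PetscCall(EPSSetDimensions(eps, nev, PETSC_DEFAULT, PETSC_DEFAULT));
    PetscCall(EPSGetST(eps, &st));
    PetscCall(STSetType(st, STSINVERT));                     /* shift-and-invert, no explicit inverse */
    PetscCall(STGetKSP(st, &ksp));
    PetscCall(KSPSetType(ksp, KSPPREONLY));
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCLU));
    PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERSUPERLU)); /* or MUMPS / PETSc LU */
    PetscCall(EPSSetFromOptions(eps));
    PetscCall(EPSSolve(eps));
    PetscCall(EPSDestroy(&eps));
    PetscFunctionReturn(PETSC_SUCCESS);
  }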


> Also, can someone please explain why SuperLU (SuperLU_SEQUENTIAL, which
> does not involve any parallel computation) is more efficient in dealing
> with large sparse matrices as compared to MATLAB's inbuilt LU solver?
>

I have no idea what MATLAB is doing. However, SuperLU uses the supernodal
formulation, and I am not sure that MATLAB does. If you care about the last
ounce of performance, it is always worth trying several packages, so you
might compare with the serial PETSc LU and MUMPS.
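
For that comparison, a minimal sketch (the routine and its setup are
illustrative, not from this thread) that lets the factorization package be
switched from the command line:

  #include <petscksp.h>

  /* A, b, x assumed assembled elsewhere */
  PetscErrorCode SolveWithDirectSolver(Mat A, Vec b, Vec x)
  {
    KSP ksp;
    PC  pc;

    PetscFunctionBeginUser;
    PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), &ksp));
    PetscCall(KSPSetOperators(ksp, A, A));
    PetscCall(KSPSetType(ksp, KSPPREONLY));     /* single application of the direct solve */
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCLU));
    /* Pick the package at run time, e.g.
         -pc_factor_mat_solver_type superlu
         -pc_factor_mat_solver_type mumps
         -pc_factor_mat_solver_type petsc   (built-in serial LU) */
    PetscCall(KSPSetFromOptions(ksp));
    PetscCall(KSPSolve(ksp, b, x));
    PetscCall(KSPDestroy(&ksp));
    PetscFunctionReturn(PETSC_SUCCESS);
  }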

  Thanks,

 Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


[petsc-users] Inverse of a Large Sparse Matrix

2023-07-26 Thread maitri ksh
I have a large sparse matrix (48x48) and I need to take its inverse
and use it to solve an eigenvalue problem. According to petsc-archive,
Barry suggested using SuperLU with MatMatSolve() (and not a KSP solver) for
matrix sizes of 5000 to 2. I was wondering two things:
a) is it possible to avoid taking the explicit inverse of the large sparse
matrix (as was discussed in the archive) for this particular case (in which
I am using the matrix-inverse for an eigenvalue problem)?
b) is a KSP solver more suitable here?

Also, can someone please explain why SuperLU (SuperLU_SEQUENTIAL, which
does not involve any parallel computation) is more efficient in dealing
with large sparse matrices as compared to MATLAB's inbuilt LU solver?