Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-09 Thread Randall Mackie
Hi Ramoni,

All EM induction problems solved numerically, for example with finite
differences, are already difficult because of the null space of the curl-curl
equations, and adding air layers on top of your model introduces another
singularity. These have been dealt with in the past by adding some sort of
divergence condition. Solving the curl-curl equations with a direct solver is
fine, but iterative solutions are difficult.
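
To make the null-space point concrete (this is only a schematic form with
generic notation, not the exact system used in the references below): the
frequency-domain induction equation is roughly

  \nabla \times \nabla \times \mathbf{E} + i \omega \mu_0 \sigma \mathbf{E}
      = - i \omega \mu_0 \mathbf{J}_s ,

and since \nabla \times (\nabla \phi) = 0 for any scalar \phi, gradient fields
are invisible to the curl-curl term; in the air layers, where \sigma is
essentially zero, the i \omega \mu_0 \sigma \mathbf{E} term no longer controls
them either, which is what makes iterative solvers struggle.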

There is no easy out-of-the-box solution to this. You can look at using
multigrid as a PC, but that requires special care; see, for example:

https://academic.oup.com/gji/article-pdf/207/3/1554/6623047/ggw352.pdf 



A good way to stabilize curl-curl solutions is the explicit inclusion of
grad-div J:

https://academic.oup.com/gji/article/216/2/906/5154929 
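
Schematically (again only a sketch of the idea, not the exact formulation in
the paper), the stabilization augments the curl-curl equation above with a
term proportional to

  \nabla ( \nabla \cdot ( \sigma \mathbf{E} ) ) ,

i.e. the grad-div of the current density \mathbf{J} = \sigma \mathbf{E} (with
appropriate scaling), so that gradient fields are no longer annihilated by the
operator.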



Good luck


Randy Mackie


> On Nov 9, 2023, at 10:54 AM, Ramoni Z. Sedano Azevedo wrote:
> 
> We are solving the Direct Problem of Controlled Source Electromagnetics 
> (CSEM) using finite difference discretization.
> 
> On Wed, Nov 8, 2023 at 13:22, Jed Brown wrote:
> What sort of problem are you solving? Algebraic multigrid like gamg or hypre 
> are good choices for elliptic problems. Sparse triangular solves have 
> horrific efficiency even on one GPU so you generally want to do your best to 
> stay away from them.
> 
> "Ramoni Z. Sedano Azevedo"  > writes:
> 
> > Hey!
> >
> > I am using PETSC in Fortran code and we apply the MPI process to
> > parallelize the code.
> >
> > At the moment, the options that have been used are
> > -ksp_monitor_true_residual
> > -ksp_type bcgs
> > -pc_type bjacobi
> > -sub_pc_type ilu
> > -sub_pc_factor_levels 3
> > -sub_pc_factor_fill 6
> >
> > Now, we want to use multiple GPUs and I would like to know if there is a
> > better solver and preconditioner pair to apply in this case.
> >
> > Yours sincerely,
> > Ramoni Z. S . Azevedo



Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-09 Thread Ramoni Z. Sedano Azevedo
We are solving the forward (direct) problem of Controlled-Source
Electromagnetics (CSEM) using a finite-difference discretization.

On Wed, Nov 8, 2023 at 13:22, Jed Brown wrote:

> What sort of problem are you solving? Algebraic multigrid like gamg or
> hypre are good choices for elliptic problems. Sparse triangular solves have
> horrific efficiency even on one GPU so you generally want to do your best
> to stay away from them.
>
> "Ramoni Z. Sedano Azevedo"  writes:
>
> > Hey!
> >
> > I am using PETSC in Fortran code and we apply the MPI process to
> > parallelize the code.
> >
> > At the moment, the options that have been used are
> > -ksp_monitor_true_residual
> > -ksp_type bcgs
> > -pc_type bjacobi
> > -sub_pc_type ilu
> > -sub_pc_factor_levels 3
> > -sub_pc_factor_fill 6
> >
> > Now, we want to use multiple GPUs and I would like to know if there is a
> > better solver and preconditioner pair to apply in this case.
> >
> > Yours sincerely,
> > Ramoni Z. S . Azevedo
>


Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-08 Thread Jed Brown
What sort of problem are you solving? Algebraic multigrid, such as gamg or
hypre, is a good choice for elliptic problems. Sparse triangular solves have
horrific efficiency even on a single GPU, so you generally want to do your
best to stay away from them.
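
For example (only a sketch, and under some assumptions: a PETSc build with
CUDA enabled, and hypre configured for the GPU if you go that route; whether
plain AMG converges well on a curl-curl system is a separate question, see
Randy Mackie's reply above), the switch can be made entirely with runtime
options along the lines of

  -vec_type cuda -mat_type aijcusparse -ksp_type bcgs -pc_type gamg

or, with hypre,

  -vec_type cuda -mat_type aijcusparse -ksp_type bcgs -pc_type hypre -pc_hypre_type boomeramg

typically run with one MPI rank per GPU.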

"Ramoni Z. Sedano Azevedo"  writes:

> Hey!
>
> I am using PETSC in Fortran code and we apply the MPI process to
> parallelize the code.
>
> At the moment, the options that have been used are
> -ksp_monitor_true_residual
> -ksp_type bcgs
> -pc_type bjacobi
> -sub_pc_type ilu
> -sub_pc_factor_levels 3
> -sub_pc_factor_fill 6
>
> Now, we want to use multiple GPUs and I would like to know if there is a
> better solver and preconditioner pair to apply in this case.
>
> Yours sincerely,
> Ramoni Z. S . Azevedo


Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-08 Thread Barry Smith


Unfortunately, ILU(3) is not something that runs well on GPUs, so ideally we
should find a preconditioner that works well in terms of iteration count but
also runs well on GPUs. You can start by saying a bit about the nature of your
problem. Is it a PDE? What type of discretization?
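
(A generic way to see this in your own runs, not anything specific to this
code: add

  -log_view

to a representative solve and look at the MatSolve and PCApply events in the
report; with bjacobi + ILU the triangular solves logged under MatSolve are
typically where the time goes, and that is the part that does not map well to
a GPU.)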

  Barry


> On Nov 8, 2023, at 9:53 AM, Ramoni Z. Sedano Azevedo wrote:
> 
> 
> Hey!
> 
> I am using PETSC in Fortran code and we apply the MPI process to parallelize 
> the code.
> 
> At the moment, the options that have been used are
> -ksp_monitor_true_residual 
> -ksp_type bcgs 
> -pc_type bjacobi 
> -sub_pc_type ilu 
> -sub_pc_factor_levels 3 
> -sub_pc_factor_fill 6 
> 
> Now, we want to use multiple GPUs and I would like to know if there is a 
> better solver and preconditioner pair to apply in this case.
> 
> Yours sincerely,
> Ramoni Z. S . Azevedo
> 
> 



[petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-08 Thread Ramoni Z. Sedano Azevedo
Hey!

I am using PETSc in a Fortran code, and we use MPI to parallelize it.

At the moment, the options that have been used are
-ksp_monitor_true_residual
-ksp_type bcgs
-pc_type bjacobi
-sub_pc_type ilu
-sub_pc_factor_levels 3
-sub_pc_factor_fill 6
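
(A sketch of how these options reach the solver, using a hypothetical
executable name: they are runtime options, so assuming the code calls
KSPSetFromOptions() they are picked up directly from the command line, e.g.

  mpiexec -n 8 ./csem_forward -ksp_monitor_true_residual -ksp_type bcgs \
      -pc_type bjacobi -sub_pc_type ilu -sub_pc_factor_levels 3 -sub_pc_factor_fill 6

The same mechanism can select GPU Vec and Mat types at run time, provided
those objects are created with VecSetFromOptions()/MatSetFromOptions() or
through a DM.)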

Now, we want to use multiple GPUs and I would like to know if there is a
better solver and preconditioner pair to apply in this case.

Yours sincerely,
Ramoni Z. S. Azevedo