Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-09 Thread Randall Mackie
Hi Ramoni, all. EM induction methods solved numerically, such as with finite differences, are already difficult because of the null space of the curl-curl equations, and adding air layers on top of your model introduces another singularity. These have been dealt with in the past by adding in some …
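
For context, a minimal sketch of the frequency-domain curl-curl system under discussion; the sign and time conventions below are assumptions, not taken from the thread:

    \nabla \times \nabla \times \mathbf{E} - i\omega\mu\sigma\mathbf{E} = -i\omega\mu\mathbf{J}_s

Since \nabla \times \nabla\phi = 0, every gradient field \nabla\phi lies in the null space of the curl-curl term; where \sigma \approx 0 (the air layers), the operator degenerates toward pure curl-curl, and that null space is what makes the discrete system nearly singular.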

Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-09 Thread Ramoni Z. Sedano Azevedo
We are solving the direct (forward) problem of Controlled Source Electromagnetics (CSEM) using a finite-difference discretization. On Wed, Nov 8, 2023 at 13:22, Jed Brown wrote: > What sort of problem are you solving? Algebraic multigrid like gamg or > hypre are good choices for elliptic …

Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-08 Thread Jed Brown
What sort of problem are you solving? Algebraic multigrid like gamg or hypre are good choices for elliptic problems. Sparse triangular solves have horrific efficiency even on one GPU, so you generally want to do your best to stay away from them. "Ramoni Z. Sedano Azevedo" writes: > Hey! > > I …
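
As an illustrative sketch of this suggestion (assuming a CUDA-enabled PETSc build; the specific combination below is untested for this problem):

    -ksp_type bcgs -pc_type gamg -mat_type aijcusparse -vec_type cuda

or, with a GPU-enabled hypre:

    -ksp_type bcgs -pc_type hypre -pc_hypre_type boomeramg -mat_type aijcusparse -vec_type cuda

Either variant keeps the Krylov method but replaces block-Jacobi/ILU with algebraic multigrid, avoiding sparse triangular solves on the device.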

Re: [petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-08 Thread Barry Smith
Unfortunately, ILU(3) is not something that runs well on GPUs, so ideally we should find a preconditioner that works well in terms of iteration count but also runs well on GPUs. You can start by saying a bit about the nature of your problem. Is it a PDE? What type of discretization? …
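
A hedged aside: before swapping preconditioners, the standard PETSc diagnostics

    -log_view -ksp_view -ksp_converged_reason

show where time is spent (-log_view reports per-event timings, including the MatSolve triangular solves mentioned above) and confirm which solver and preconditioner are actually being used (-ksp_view).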

[petsc-users] Better solver and preconditioner to use multiple GPU

2023-11-08 Thread Ramoni Z. Sedano Azevedo
Hey! I am using PETSc in a Fortran code, and we use MPI to parallelize it. At the moment, the options in use are

    -ksp_monitor_true_residual -ksp_type bcgs -pc_type bjacobi -sub_pc_type ilu -sub_pc_factor_levels 3 -sub_pc_factor_fill 6

Now, we want to use multiple …
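
For reference, the minimal change to run this existing configuration on GPUs (a sketch, assuming PETSc was configured with CUDA; as the replies above note, bjacobi+ILU will still perform sparse triangular solves on the device):

    -ksp_monitor_true_residual -ksp_type bcgs -pc_type bjacobi -sub_pc_type ilu -sub_pc_factor_levels 3 -sub_pc_factor_fill 6 -mat_type aijcusparse -vec_type cuda

Adding -mat_type aijcusparse and -vec_type cuda moves matrix and vector operations to the GPU without source changes; choosing a GPU-friendly preconditioner is what the rest of the thread addresses.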