Thank you, Mark and Barry, for the advice.

-ksp_type richardson may be what I am looking for. I wanted a simple iterative 
scheme that lets AMG be the main workhorse, and this allows that.
With -ksp_type preonly, I observed only one KSP iteration, and it does not 
drop the residual norm to my desired level (-ksp_rtol 1e-7), whereas with 
GMRES or Richardson I see 6-10 KSP iterations and the desired residual drop. 
Is it possible to increase (and fix) the number of KSP iterations with 
preonly? You mentioned below that it gives only one cycle, so I presume it is 
not possible.
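For reference, here is a sketch of the option sets being compared (assuming 
hypre/BoomerAMG as the AMG backend; -ksp_norm_type none is my understanding 
of how to disable the convergence test so that exactly -ksp_max_it Richardson 
iterations run):

```shell
# Richardson accelerated by one BoomerAMG V-cycle per iteration,
# stopping on a relative residual tolerance of 1e-7
-ksp_type richardson -pc_type hypre -pc_hypre_type boomeramg -ksp_rtol 1e-7

# Fixed number of cycles: skip the norm/convergence test and
# run exactly 10 Richardson iterations
-ksp_type richardson -pc_type hypre -pc_hypre_type boomeramg \
  -ksp_max_it 10 -ksp_norm_type none
```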

Geometric multigrid is not a feasible option for my application, as I will 
ultimately be dealing with 3D unstructured meshes for hp-finite element 
methods. Some clever tweaks (like using sub-structured meshes) may allow it, 
but not as of now.

Best
Parv



From: Barry Smith <[email protected]>
Sent: 05 July 2023 16:51
To: Khurana, Parv <[email protected]>
Cc: [email protected]
Subject: Re: [petsc-users] Running AMG libraries as standalone solvers

   BTW: in my experience

Geometric multigrid generally works great as a standalone solver; 
accelerating it with a Krylov method is not needed, and while it may improve 
the convergence rate slightly, it may end up taking a bit more time than not 
using the Krylov solver (due to the extra Krylov solver overhead).

Algebraic multigrid is usually run with a Krylov solver; running it without 
one, as far as I am aware, generally performs poorly.
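In option form, the two setups contrasted above might look like the 
following (a sketch using PETSc's built-in GAMG; hypre/BoomerAMG would be 
configured analogously):

```shell
# AMG as a preconditioner for a Krylov method (the usual setup)
-ksp_type gmres -pc_type gamg

# AMG as the solver itself: a stationary (Richardson) iteration
# applying one AMG cycle per step
-ksp_type richardson -pc_type gamg
```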



On Jul 5, 2023, at 11:46 AM, Barry Smith <[email protected]> wrote:


  -ksp_type richardson

  If you use -ksp_type preonly you effectively get one V-cycle (or whatever 
cycle type you may have selected), which in general will give you a more or 
less garbage answer.




On Jul 5, 2023, at 11:28 AM, Khurana, Parv <[email protected]> wrote:

Hello PETSc users,

I am fairly inexperienced with PETSc as of now and am new to the mailing list! 
Thanks for running this channel.

I seek basic help with running AMG routines (BoomerAMG/ML/GAMG). I am trying 
to compare the performance of solving a Poisson problem with an 
AMG-preconditioned GMRES iterative solve versus using AMG as the solver 
itself. I set PETSc options via the options database keys for now, and the 
solve is connected to a flow solver (Nektar++) that I use for my research.

I currently run the AMG-preconditioned GMRES iterative solve by setting 
-ksp_type gmres and then specifying the preconditioner I want with, e.g., 
-pc_type hypre -pc_hypre_type boomeramg. If I want to use the AMG routine 
directly, I currently set -ksp_type preonly with the same -pc_type. However, 
I am not sure this is the correct way to go about it, for two reasons: 
a) my solution using AMG as a solver with these options has a larger error 
than AMG-preconditioned GMRES (which could still be acceptable), and b) I 
could not find any clear documentation on how to use AMG directly as a 
solver. I saw some hints in the examples here: 
https://petsc.org/main/tutorials/handson/, but they have not helped me.
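For concreteness, the two configurations described above would be (a sketch, 
assuming the hypre/BoomerAMG option names from the PETSc options database):

```shell
# AMG-preconditioned GMRES
-ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg

# Attempted "AMG as the solver": preonly applies the
# preconditioner exactly once (a single AMG cycle)
-ksp_type preonly -pc_type hypre -pc_hypre_type boomeramg
```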

Any hints on how to use AMG directly as a solver?

Best
Parv

