Hi,

I'm not on the PETSc team, but I have some experience with these two
multilevel preconditioners.  For starters, take a look at the publication
below by one of the hypre team members on parameter choices for 2D and 3D
Poisson problems that deliver the best performance.  Pay particular
attention to pp. 18-22.  There are many knobs on these solvers (BoomerAMG
in particular), and they may need tweaking to improve performance.

https://computation.llnl.gov/casc/linear_solvers/pubs/yang1.pdf
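As one illustration of the kinds of knobs that paper discusses, PETSc exposes the main BoomerAMG parameters as runtime options.  The snippet below is only a sketch for a 3D problem (the executable name and process count are placeholders, and the values are starting points to experiment with, not a recipe):

```shell
# Sketch: BoomerAMG tuning for a 3D Poisson-style problem.
# ./my_poisson_app is a placeholder for your own executable.
# - strong_threshold: 0.5 is usually better for 3D (the default, 0.25, suits 2D)
# - coarsen_type: HMIS/PMIS coarsening is cheaper than the default Falgout
# - agg_nl: aggressive coarsening on the first levels reduces memory and setup cost
mpiexec -n 8 ./my_poisson_app \
  -ksp_type cg \
  -pc_type hypre -pc_hypre_type boomeramg \
  -pc_hypre_boomeramg_strong_threshold 0.5 \
  -pc_hypre_boomeramg_coarsen_type HMIS \
  -pc_hypre_boomeramg_agg_nl 2 \
  -ksp_monitor -ksp_converged_reason
```

Run with -help to see the full list of -pc_hypre_boomeramg_* options your PETSc build supports.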

Also, what is your definition of poor scalability?  With respect to
increasing processor count (i.e., parallel scalability), or with respect to
performance as the problem size increases?  Both of these preconditioners
have been thoroughly tested on Poisson-style problems, and I'd be surprised
if you didn't get at least good scalability with respect to problem size.
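On the ML side, since PETSc drives ML through its PCMG infrastructure, the per-level smoothers can also be adjusted from the command line, which is often where iteration counts are won or lost.  A sketch (executable name hypothetical; values are starting points only):

```shell
# Sketch: ML with stronger per-level smoothing to reduce KSP iterations.
# ./my_poisson_app is a placeholder for your own executable.
# PETSc's ML wrapper sits on top of PCMG, so -mg_levels_* options apply;
# two SOR smoother sweeps per level often cut the iteration count at a
# modest extra cost per iteration.
mpiexec -n 8 ./my_poisson_app \
  -ksp_type cg \
  -pc_type ml \
  -pc_ml_maxNlevels 10 \
  -mg_levels_ksp_type richardson \
  -mg_levels_pc_type sor \
  -mg_levels_ksp_max_it 2 \
  -ksp_monitor -ksp_converged_reason
```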

Travis

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Travis Austin, Ph.D.
Tech-X Corporation
5621 Arapahoe Ave, Suite A
Boulder, CO 80303
austin at txcorp.com
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


On May 19, 2011, at 5:56 PM, Li, Zhisong (lizs) wrote:

> Hi, Petsc Team,
> 
> Recently I tested my 3D structured Poisson-style problem with ML and 
> BoomerAMG preconditioner respectively.  In comparison, ML is more efficient 
> in preconditioning and RAM usage, but it requires 2 times more iterations on 
> the same KSP solver, bringing down the overall efficiency.  And both PCs 
> don't scale well.  I wonder if there's any specific approach to optimizing ML 
> to reduce KSP iterations by setting certain command line options.
> 
> I also saw in some previous petsc mail archives mentioning the "local 
> preconditioner".  As some important PC like PCILU and PCICC are not available 
> for parallel processing, it may be beneficial to apply them as local 
> preconditioners.  The question is how to setup a local preconditioner?
> 
> 
> Thank you very much.
> 
> 
> 
> Zhisong Li
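Regarding the "local preconditioner" question above: the usual way to apply a serial PC such as ILU or ICC in parallel with PETSc is through block Jacobi (PCBJACOBI) or additive Schwarz (PCASM), where the serial factorization runs on each process's local block.  A minimal sketch (executable name hypothetical):

```shell
# Sketch: block Jacobi with ILU(0) applied on each process's local block.
# ./my_poisson_app is a placeholder for your own executable.
mpiexec -n 8 ./my_poisson_app \
  -ksp_type gmres \
  -pc_type bjacobi \
  -sub_pc_type ilu \
  -ksp_monitor
# Or overlapping additive Schwarz with ICC on each subdomain:
#   -pc_type asm -sub_pc_type icc
```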
