Please copy the list in your replies.
On 2 August 2015 at 19:07, Charles Cook <[email protected]> wrote:
> Sorry to be so dense, but how could I use something like AMG on the
> blocks?
>
Try (untested):
PETScOptions.set("pc_type", "bjacobi")
PETScOptions.set("sub_pc_type", "hypre")
PETScOptions.set("sub_pc_hypre_type", "boomeramg")
# Print PC details to check
PETScOptions.set("pc_view")
# Create solver and solve below
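Since these are plain string keys in the PETSc options database, the way the `sub_` prefix reaches the per-block preconditioner can be sketched without PETSc at all (plain Python; `set_option` and `block_option` are hypothetical illustrative helpers, not the DOLFIN or PETSc API):

```python
# Toy model of the PETSc options database: options live under flat string
# names, and block Jacobi's per-block solver looks its options up under
# the "sub_" prefix.
options = {}

def set_option(name, value=None):
    """Mimic PETScOptions.set(name[, value]); bare names are flag options."""
    options[name] = value

set_option("pc_type", "bjacobi")
set_option("sub_pc_type", "hypre")            # preconditioner on each block
set_option("sub_pc_hypre_type", "boomeramg")  # hypre variant on each block
set_option("pc_view")                         # flag: print PC details

def block_option(name):
    """What the per-block (sub-)preconditioner would look up for `name`."""
    return options.get("sub_" + name)

print(block_option("pc_type"))        # the blocks see "hypre"
print(block_option("pc_hypre_type"))  # and "boomeramg"
```

The real options database behaves analogously: the outer `pc_type` stays `bjacobi`, and anything prefixed `sub_` is consumed by the solver on each block.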
Garth
> I looked through the PETSc Krylov solver parameters and it wasn’t clear
> how one could set another preconditioner. I did see the fill level for
> ILU in there. It looked like parameter sets for the various
> preconditioners were missing, along with a parameter on the solver to
> specify which one to use.
>
> Perhaps a demo for Stokes iterative with block Jacobi?
>
> Thank you again,
>
> Charles
>
>
> From: Garth N. Wells [mailto:[email protected]]
> Sent: Sunday, August 2, 2015 12:56 PM
> To: Charles Cook <[email protected]>
> Cc: [email protected]
> Subject: Re: [FEniCS-support] bjacobi preconditioner not found after installing 1.6?
>
> On Sunday, 2 August 2015, Charles Cook <[email protected]> wrote:
>
> ‘bjacobi’ is by far the best performing for me… Any recommendations on
> where I should go from there if it is being removed?
>
> It hasn't been removed; the interface has changed. If you don't set a
> preconditioner, with PETSc you get block Jacobi by default, with ILU on
> the blocks. Using the PETSc options interface, you can control the
> preconditioner on the blocks, i.e. change it to something other than ILU.
>
> Garth
>
> Thank you,
>
> Charles
>
> From: Garth N. Wells [mailto:[email protected]]
> Sent: Sunday, August 2, 2015 2:12 AM
> To: Charles Cook <[email protected]>
> Cc: [email protected]
> Subject: Re: [FEniCS-support] bjacobi preconditioner not found after installing 1.6?
>
> On 2 August 2015 at 00:00, Charles Cook <[email protected]> wrote:
>
> Hello,
>
> I’ve been working on installing the FEniCS project on an HPC system and
> thought this would also be a good time to upgrade to 1.6. After
> installing 1.6 with hashdist, the preconditioner I had been using,
> bjacobi, appears to no longer be available.
>
> 'bjacobi' doesn't define a preconditioner. If you don't select a
> preconditioner, by default with PETSc you'll get block Jacobi, with the
> PETSc default solver for each process block.
>
> If you want to be sure of what you're getting, add
>
> PETScOptions::set("pc_type", "bjacobi");
>
> before creating the solver.
>
> Garth
>
> I assume I am missing a dependency in PETSc?
>
> I am using a local profile, mostly to get around a server being offline:
>
> extends:
>   - file: linux.yaml
>
> packages:
>   m4:
>   launcher:
>   cmake:
>   python:
>     link: shared
>   mpi:
>     use: mpich
>   blas:
>     use: openblas
>   lapack:
>     use: openblas
>   hypre:
>     with_openblas: true
>     without_check: true
>   petsc:
>     build_with: |
>       openblas, parmetis, scotch, suitesparse, superlu_dist, hypre
>     download: |
>       mumps, scalapack, blacs, ml, hypre, superlu, superlu_dista
>     coptflags: -O2
>     link: shared
>     debug: false
>   swig:
>   boost:
>     toolset: gcc
>     build_with: python
>   ipython:
>   matplotlib:
>   ffc:
>   fiat:
>   instant:
>   ufl:
>   uflacs:
>   slepc:
>     url: http://fenicsproject.org/pub/software/contrib/slepc-3.5.2.tar.gz
>   dolfin:
>     build_with: |
>       openblas, hdf5, parmetis, petsc, petsc4py, suitesparse, scotch,
>       slepc, slepc4py, vtk, zlib
>   mpi4py:
>   mshr:
>
> Looking into the source:
> https://bitbucket.org/fenics-project/dolfin/src/d50bf5ab9bb7c282b68c5d3c265be2660fedde19/dolfin/la/PETScPreconditioner.cpp?at=master
>
> It looks like the preconditioner should be there by default, though it
> is missing from the descriptions:
>
> // Mapping from preconditioner string to PETSc
> const std::map<std::string, const PCType> PETScPreconditioner::_methods
> = { {"default",          ""},
>     {"ilu",              PCILU},
>     {"icc",              PCICC},
>     {"jacobi",           PCJACOBI},
>     {"bjacobi",          PCBJACOBI},   // <-- bjacobi is here
>     {"sor",              PCSOR},
>     {"additive_schwarz", PCASM},
>     {"petsc_amg",        PCGAMG},
> #if PETSC_HAVE_HYPRE
>     {"hypre_amg",        PCHYPRE},
>     {"hypre_euclid",     PCHYPRE},
>     {"hypre_parasails",  PCHYPRE},
> #endif
> #if PETSC_HAVE_ML
>     {"amg",              PCML},
>     {"ml_amg",           PCML},
> #elif PETSC_HAVE_HYPRE
>     {"amg",              PCHYPRE},
> #endif
>     {"none",             PCNONE} };
>
> // Mapping from preconditioner string to description string
> const std::map<std::string, std::string>
> PETScPreconditioner::_methods_descr
> = { {"default",          "default preconditioner"},
>     {"ilu",              "Incomplete LU factorization"},
>     {"icc",              "Incomplete Cholesky factorization"},
>     {"sor",              "Successive over-relaxation"},
>     {"petsc_amg",        "PETSc algebraic multigrid"},
> #if PETSC_HAVE_HYPRE
>     {"amg",              "Algebraic multigrid"},
>     {"hypre_amg",        "Hypre algebraic multigrid (BoomerAMG)"},
>     {"hypre_euclid",     "Hypre parallel incomplete LU factorization"},
>     {"hypre_parasails",  "Hypre parallel sparse approximate inverse"},
> #endif
> #if PETSC_HAVE_ML
>     {"ml_amg",           "ML algebraic multigrid"},
> #endif
>     {"none",             "No preconditioner"} };
>
> Any suggestions would be appreciated.
>
> As an aside, thanks again for this project! I’m looking to defend soon
> and this project has been critical to my research.
>
> Thank you,
>
> Charles
>
> _______________________________________________
> fenics-support mailing list
> [email protected]
> http://fenicsproject.org/mailman/listinfo/fenics-support
>
> --
> Garth N. Wells
> Department of Engineering, University of Cambridge
> http://www.eng.cam.ac.uk/~gnw20
>
_______________________________________________
fenics-support mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics-support