The issue has already been reported; see
http://fenicsproject.org/pipermail/fenics/2015-July/002875.html

In short, nothing is missing from the PETSc build: "bjacobi" (like
"jacobi" and "additive_schwarz") is present in the _methods map quoted
below, but absent from the _methods_descr map that DOLFIN validates
preconditioner names against, so the name is rejected even though
PETSc provides it.
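
To confirm what a given build accepts, the list that DOLFIN validates
preconditioner names against can be printed from Python. A minimal
sketch, assuming the dolfin module from the affected build is on the
Python path:

    from dolfin import list_krylov_solver_preconditioners

    # Prints the name/description table DOLFIN accepts, i.e. the
    # contents of _methods_descr rather than _methods, so "bjacobi"
    # should be absent from this output on an affected build.
    list_krylov_solver_preconditioners()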

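In the meantime, a possible workaround is to bypass the name check and
select the preconditioner through PETSc's options database instead. A
sketch, not tested against 1.6; it assumes this DOLFIN build applies
globally set PETSc options to the KSP objects it creates:

    from dolfin import KrylovSolver, PETScOptions

    # Ask PETSc directly for block Jacobi; "default" passes DOLFIN's
    # name validation and leaves the PC choice to PETSc.
    PETScOptions.set("pc_type", "bjacobi")
    solver = KrylovSolver("gmres", "default")
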
Should we consider patching this for the 1.6.1 release, Garth?
Presumably it is just a matter of adding the missing entries
("jacobi", "bjacobi", "additive_schwarz", and "amg" in the ML-only
branch) to _methods_descr.

Jan


On Sat, 1 Aug 2015 19:00:39 -0400
"Charles Cook" <[email protected]> wrote:

> Hello,
> 
> I've been working on installing the FEniCS project on an HPC system
> and thought this would also be a good time to upgrade to 1.6. After
> installing 1.6 with HashDist, the preconditioner I had been using,
> bjacobi, appears to no longer be available.
> 
> I assume I am missing a dependency in PETSc?
> 
> I am using a local profile, mostly to get around a server being
> offline:
> 
> extends:
> - file: linux.yaml
> 
> packages:
>   m4:
>   launcher:
>   cmake:
>   python:
>     link: shared
>   mpi:
>     use: mpich
>   blas:
>     use: openblas
>   lapack:
>     use: openblas
>   hypre:
>     with_openblas: true
>     without_check: true
>   petsc:
>     build_with: |
>       openblas, parmetis, scotch, suitesparse, superlu_dist, hypre
>     download: |
>       mumps, scalapack, blacs, ml, hypre, superlu, superlu_dist
>     coptflags: -O2
>     link: shared
>     debug: false
>   swig:
>   boost:
>     toolset: gcc
>     build_with: python
>   ipython:
>   matplotlib:
>   ffc:
>   fiat:
>   instant:
>   ufl:
>   uflacs:
>   slepc:
>     url: http://fenicsproject.org/pub/software/contrib/slepc-3.5.2.tar.gz
>   dolfin:
>     build_with: |
>       openblas, hdf5, parmetis, petsc, petsc4py, suitesparse, scotch,
>       slepc, slepc4py, vtk, zlib
>   mpi4py:
>   mshr:
> 
> Looking into the source:
> https://bitbucket.org/fenics-project/dolfin/src/d50bf5ab9bb7c282b68c5d3c265be2660fedde19/dolfin/la/PETScPreconditioner.cpp?at=master
> 
> It looks like the preconditioner should be there by default, though
> it is missing from the descriptions map (_methods_descr):
> 
> // Mapping from preconditioner string to PETSc
> const std::map<std::string, const PCType> PETScPreconditioner::_methods
> = { {"default",          ""},
>     {"ilu",              PCILU},
>     {"icc",              PCICC},
>     {"jacobi",           PCJACOBI},
>     {"bjacobi",          PCBJACOBI},
>     {"sor",              PCSOR},
>     {"additive_schwarz", PCASM},
>     {"petsc_amg",        PCGAMG},
> #if PETSC_HAVE_HYPRE
>     {"hypre_amg",        PCHYPRE},
>     {"hypre_euclid",     PCHYPRE},
>     {"hypre_parasails",  PCHYPRE},
> #endif
> #if PETSC_HAVE_ML
>     {"amg",              PCML},
>     {"ml_amg",           PCML},
> #elif PETSC_HAVE_HYPRE
>     {"amg",              PCHYPRE},
> #endif
>     {"none",             PCNONE} };
> 
> // Mapping from preconditioner string to description string
> const std::map<std::string, std::string>
> PETScPreconditioner::_methods_descr
> = { {"default",          "default preconditioner"},
>     {"ilu",              "Incomplete LU factorization"},
>     {"icc",              "Incomplete Cholesky factorization"},
>     {"sor",              "Successive over-relaxation"},
>     {"petsc_amg",        "PETSc algebraic multigrid"},
> #if PETSC_HAVE_HYPRE
>     {"amg",              "Algebraic multigrid"},
>     {"hypre_amg",        "Hypre algebraic multigrid (BoomerAMG)"},
>     {"hypre_euclid",     "Hypre parallel incomplete LU factorization"},
>     {"hypre_parasails",  "Hypre parallel sparse approximate inverse"},
> #endif
> #if PETSC_HAVE_ML
>     {"ml_amg",           "ML algebraic multigrid"},
> #endif
>     {"none",             "No preconditioner"} };
> 
> Any suggestions would be appreciated.
> 
> As an aside, thanks again for this project! I'm looking to defend
> soon, and it has been critical to my research.
> 
> Thank you,
> 
> Charles

_______________________________________________
fenics-support mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics-support
