Re: [petsc-users] GAMG and linearized elasticity

2022-12-13, Jed Brown
Do you have slip/symmetry boundary conditions, where some components are 
constrained? In that case, there is no uniform block size and I think you'll 
need DMPlexCreateRigidBody() and MatSetNearNullSpace().

The PCSetCoordinates() code won't work for non-constant block size.
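
A minimal sketch of the near-null-space route, assuming a DMPlex dm for the
displacement problem and an assembled Jacobian J (the variable names here are
placeholders, not from your code):

  MatNullSpace rbm;

  /* Build the rigid-body modes (translations and rotations) from the mesh
     coordinates; the second argument selects the field (0 here). */
  PetscCall(DMPlexCreateRigidBody(dm, 0, &rbm));
  /* Attach them to the operator so GAMG can use them when it builds the
     aggregates and the smoothed prolongator. */
  PetscCall(MatSetNearNullSpace(J, rbm));
  PetscCall(MatNullSpaceDestroy(&rbm));

Unlike PCSetCoordinates(), this path does not assume a uniform block size, so
it is compatible with constrained components.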

-pc_type gamg should work okay out of the box for elasticity. For hypre, I've 
had good luck with this options suite, which also runs on GPU.

-pc_type hypre -pc_hypre_boomeramg_coarsen_type pmis 
-pc_hypre_boomeramg_interp_type ext+i -pc_hypre_boomeramg_no_CF 
-pc_hypre_boomeramg_P_max 6 -pc_hypre_boomeramg_relax_type_down Chebyshev 
-pc_hypre_boomeramg_relax_type_up Chebyshev 
-pc_hypre_boomeramg_strong_threshold 0.5
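
One caveat: if your solver object has an options prefix (your log below shows
-displacement_ksp_converged_reason), each of these options needs that prefix
too, e.g. -displacement_pc_type hypre
-displacement_pc_hypre_boomeramg_coarsen_type pmis, and so on for the rest.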

Blaise Bourdin writes:

> Hi,
>
> I am getting close to finishing porting a code from PETSc 3.3 / Sieve to
> main / DMPlex, but am now encountering difficulties.
> I am reasonably sure that the Jacobian and residual are correct. The codes
> handle boundary conditions differently (MatZeroRowsColumns vs DMPlex
> constraints), so it is not trivial to compare them. Running with -snes_type
> ksponly and -pc_type jacobi or hypre gives me the same results in roughly
> the same number of iterations.
>
> In my old code, GAMG would work out of the box. When using petsc-main,
> -pc_type gamg -pc_gamg_type agg works for _some_ problems using P1-Lagrange
> elements, but never for P2-Lagrange. The typical error message is in
> gamg_agg.txt.
>
> When using -pc_gamg_type classical, a problem where the KSP would converge
> in 47 iterations in 3.3 now takes 1400. ksp_view_3.3.txt and
> ksp_view_main.txt show the output of -ksp_view for both versions. I don’t
> notice anything obvious.
>
> Strangely, removing the call to PCSetCoordinates does not have any impact on 
> the
> convergence.
>
> I am sure that I am missing something, or not passing the right options. 
> What’s a good
> starting point for 3D elasticity?
> Regards,
> Blaise
>

[petsc-users] GAMG and linearized elasticity

2022-12-13, Blaise Bourdin

Hi,

I am getting close to finishing porting a code from PETSc 3.3 / Sieve to main / DMPlex, but am now encountering difficulties.

I am reasonably sure that the Jacobian and residual are correct. The codes handle boundary conditions differently (MatZeroRowsColumns vs DMPlex constraints), so it is not trivial to compare them. Running with -snes_type ksponly and -pc_type jacobi or hypre gives me the same results in roughly the same number of iterations.

In my old code, GAMG would work out of the box. When using petsc-main, -pc_type gamg -pc_gamg_type agg works for _some_ problems using P1-Lagrange elements, but never for P2-Lagrange. The typical error message is in gamg_agg.txt.

When using -pc_gamg_type classical, a problem where the KSP would converge in 47 iterations in 3.3 now takes 1400. ksp_view_3.3.txt and ksp_view_main.txt show the output of -ksp_view for both versions. I don’t notice anything obvious.

Strangely, removing the call to PCSetCoordinates does not have any impact on the convergence.

I am sure that I am missing something, or not passing the right options. What’s a good starting point for 3D elasticity?
Regards,
Blaise

— 
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Computed maximum singular value as zero
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be 
the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-displacement_ksp_converged_reason value: 
ascii source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.2-341-g16200351da0  GIT 
Date: 2022-12-12 23:42:20 +
[0]PETSC ERROR: 
/home/bourdinb/Development/mef90/mef90-dmplex/bbserv-gcc11.2.1-mvapich2-2.3.7-O/bin/ThermoElasticity
 on a bbserv-gcc11.2.1-mvapich2-2.3.7-O named bb01 by bourdinb Tue Dec 13 
17:02:19 2022
[0]PETSC ERROR: Configure options --CFLAGS=-Wunused 
--FFLAGS="-ffree-line-length-none -fallow-argument-mismatch -Wunused" 
--COPTFLAGS="-O2 -march=znver2" --CXXOPTFLAGS="-O2 -march=znver2" 
--FOPTFLAGS="-O2 -march=znver2" --download-chaco=1 --download-exodusii=1 
--download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 
--download-metis=1 --download-ml=1 --download-mumps=1 --download-netcdf=1 
--download-p4est=1 --download-parmetis=1 --download-pnetcdf=1 
--download-scalapack=1 --download-sowing=1 
--download-sowing-cc=/opt/rh/devtoolset-9/root/usr/bin/gcc 
--download-sowing-cxx=/opt/rh/devtoolset-9/root/usr/bin/g++ 
--download-sowing-cpp=/opt/rh/devtoolset-9/root/usr/bin/cpp 
--download-sowing-cxxcpp=/opt/rh/devtoolset-9/root/usr/bin/cpp 
--download-superlu=1 --download-triangle=1 --download-yaml=1 --download-zlib=1 
--with-debugging=0 --with-mpi-dir=/opt/HPC/mvapich2/2.3.7-gcc11.2.1 --with-pic 
--with-shared-libraries=1 --with-mpiexec=srun --with-x11=0
[0]PETSC ERROR: #1 PCGAMGOptProlongator_AGG() at 
/1/HPC/petsc/main/src/ksp/pc/impls/gamg/agg.c:779
[0]PETSC ERROR: #2 PCSetUp_GAMG() at 
/1/HPC/petsc/main/src/ksp/pc/impls/gamg/gamg.c:639
[0]PETSC ERROR: #3 PCSetUp() at 
/1/HPC/petsc/main/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #4 KSPSetUp() at 
/1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:405
[0]PETSC ERROR: #5 KSPSolve_Private() at 
/1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:824
[0]PETSC ERROR: #6 KSPSolve() at 
/1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:1070
[0]PETSC ERROR: #7 SNESSolve_KSPONLY() at 
/1/HPC/petsc/main/src/snes/impls/ksponly/ksponly.c:48
[0]PETSC ERROR: #8 SNESSolve() at 
/1/HPC/petsc/main/src/snes/interface/snes.c:4693
[0]PETSC ERROR: #9 
/home/bourdinb/Development/mef90/mef90-dmplex/ThermoElasticity/ThermoElasticity.F90:228
  Linear solve converged due to CONVERGED_RTOL iterations 46
KSP Object:(Disp_) 32 MPI processes
  type: cg
  maximum iterations=1
  tolerances:  relative=1e-05, absolute=1e-08, divergence=1e+10
  left preconditioning
  using nonzero initial guess
  using PRECONDITIONED norm type for convergence test
PC Object:(Disp_) 32 MPI processes
  type: gamg
    MG: type is MULTIPLICATIVE, levels=4 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level ---
    KSP Object:(Disp_mg_coarse_) 32 MPI processes
      type: gmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=1, initial