Hi Matthew,

Which database option are you referring to?

I tried adding -fieldsplit_mg_levels_ksp_type gmres (and, in a second run, -fieldsplit_mg_levels_ksp_max_it 4) to my options (cf. below). The iterations now start, but PETSc takes an hour to do 13 of them, so something must be wrong.
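
For completeness, the additions on top of the baseline below were

-fieldsplit_mg_levels_ksp_type gmres \
-fieldsplit_mg_levels_ksp_max_it 4

where, if I understand the option-prefix composition correctly, the fieldsplit_ prefix should make these apply to the GAMG level smoothers inside every split.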

Reminder: my baseline database options line reads

-ksp_view_pre -ksp_monitor -ksp_converged_reason \
-ksp_rtol 1.0e-8 -ksp_gmres_restart 300 \
-ksp_type fgmres \
-pc_type fieldsplit \
-pc_fieldsplit_type multiplicative \
-pc_fieldsplit_block_size 3 \
-pc_fieldsplit_0_fields 0   \
-pc_fieldsplit_1_fields 1   \
-pc_fieldsplit_2_fields 2   \
-fieldsplit_pc_type gamg    \
-fieldsplit_ksp_type gmres  \
-fieldsplit_ksp_rtol 1.0e-8

which gives

KSP Object: 1 MPI processes
  type: fgmres
    restart=300, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-08, absolute=1e-50, divergence=10000.
  left preconditioning
  using DEFAULT norm type for convergence test
PC Object: 1 MPI processes
  type: fieldsplit
  PC has not been set up so information may be incomplete
    FieldSplit with MULTIPLICATIVE composition: total splits = 3, blocksize = 3
    Solver info for each split is in the following KSP objects:
  Split number 0 Fields  0
  KSP Object: (fieldsplit_0_) 1 MPI processes
    type: preonly
    maximum iterations=10000, initial guess is zero
    tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
    left preconditioning
    using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_0_) 1 MPI processes
    type not yet set
    PC has not been set up so information may be incomplete
  Split number 1 Fields  1
  KSP Object: (fieldsplit_1_) 1 MPI processes
    type: preonly
    maximum iterations=10000, initial guess is zero
    tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
    left preconditioning
    using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_1_) 1 MPI processes
    type not yet set
    PC has not been set up so information may be incomplete
  Split number 2 Fields  2
  KSP Object: (fieldsplit_2_) 1 MPI processes
    type: preonly
    maximum iterations=10000, initial guess is zero
    tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
    left preconditioning
    using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_2_) 1 MPI processes
    type not yet set
    PC has not been set up so information may be incomplete
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=52500, cols=52500
    total: nonzeros=1127079, allocated nonzeros=1128624
    total number of mallocs used during MatSetValues calls =0
      not using I-node routines
  0 KSP Residual norm 3.583290589961e+00
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Eigen estimator failed: DIVERGED_NANORINF at iteration 0
[0]PETSC ERROR: Petsc Release Version 3.10.2, unknown
[0]PETSC ERROR: Configure options --PETSC_ARCH=msi_cplx_debug --with-scalar-type=complex --with-precision=double --with-debugging=1 --with-valgrind=1 --with-debugger=gdb --with-fortran-kernels=1 --download-mpich --download-hwloc --download-fblaslapack --download-scalapack --download-metis --download-parmetis --download-ptscotch --download-mumps --download-slepc
[0]PETSC ERROR: #1 KSPSolve_Chebyshev() line 381 in /home/thibaut/Packages/petsc/src/ksp/ksp/impls/cheby/cheby.c
[0]PETSC ERROR: #2 KSPSolve() line 780 in /home/thibaut/Packages/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #3 PCMGMCycle_Private() line 20 in /home/thibaut/Packages/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #4 PCApply_MG() line 377 in /home/thibaut/Packages/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #5 PCApply() line 462 in /home/thibaut/Packages/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #6 KSP_PCApply() line 281 in /home/thibaut/Packages/petsc/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #7 KSPInitialResidual() line 67 in /home/thibaut/Packages/petsc/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #8 KSPSolve_GMRES() line 233 in /home/thibaut/Packages/petsc/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #9 KSPSolve() line 780 in /home/thibaut/Packages/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #10 PCApply_FieldSplit() line 1107 in /home/thibaut/Packages/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: #11 PCApply() line 462 in /home/thibaut/Packages/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #12 KSP_PCApply() line 281 in /home/thibaut/Packages/petsc/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #13 KSPFGMRESCycle() line 166 in /home/thibaut/Packages/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
[0]PETSC ERROR: #14 KSPSolve_FGMRES() line 291 in /home/thibaut/Packages/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
[0]PETSC ERROR: #15 KSPSolve() line 780 in /home/thibaut/Packages/petsc/src/ksp/ksp/interface/itfunc.c


Thibaut



On 30/10/2018 23:12, Matthew Knepley wrote:
On Tue, Oct 30, 2018 at 5:21 PM Appel, Thibaut via petsc-users <petsc-users@mcs.anl.gov <mailto:petsc-users@mcs.anl.gov>> wrote:

    Dear users,

    Following a suggestion from Matthew Knepley, I've been trying to
    apply fieldsplit/gamg to my set of PDEs, but I'm still
    encountering issues despite various tests: pc_gamg simply won't start.
    Note that direct solvers always yield the correct, physical result.
Removing the fieldsplit to focus on the gamg bit

I think this is a mistake, because you want to try GAMG on a simpler operator.

Can you go back to splitting the system and try using GMRES as the smoother? It's important to see
whether the smoother makes no progress, or the coarse correction stinks.
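
For example, something along these lines (just a sketch; the exact option prefixes depend on how your splits compose with GAMG):

-fieldsplit_mg_levels_ksp_type gmres \
-fieldsplit_mg_levels_ksp_max_it 4 \
-fieldsplit_mg_levels_ksp_monitor

The level monitor should show whether the smoother is actually reducing the residual on each level.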

  Thanks,

     Matt

    and trying to solve the linear system on a modest-size problem
    still gives, with

    '-ksp_monitor -ksp_rtol 1.0e-10 -ksp_gmres_restart 300 -ksp_type
    gmres -pc_type gamg'

    [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
    [3]PETSC ERROR: Petsc has generated inconsistent data
    [3]PETSC ERROR: Have un-symmetric graph (apparently). Use '-(null)pc_gamg_sym_graph true' to symetrize the graph or '-(null)pc_gamg_threshold -1' if the matrix is structurally symmetric.

    And since then, after adding '-pc_gamg_sym_graph true' I have been
    getting
    [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
    [0]PETSC ERROR: Petsc has generated inconsistent data
    [0]PETSC ERROR: Eigen estimator failed: DIVERGED_NANORINF at iteration

    Setting -ksp_chebyshev_esteig_noisy to 0 or 1 does not change anything.

    Knowing that the Chebyshev eigenvalue estimator needs a positive
    spectrum, I tried '-mg_levels_ksp_type gmres', but the iterations
    would just go on endlessly.
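
    (A capped variant of that smoother, e.g.

    -mg_levels_ksp_type gmres \
    -mg_levels_ksp_max_it 4

    would at least bound the work of each smoothing application.)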

    It seems that there are indeed eigenvalues of rather large
    magnitude in the spectrum of my operator, though I am unable to
    determine the reason.
    The corresponding eigenvectors look like small artifacts at the
    wall-inflow or wall-outflow corners, with zeros everywhere else,
    but I do not know how to interpret this.
    The equations are the time-harmonic linearized Navier-Stokes
    equations with an applied forcing; there is no time-marching.

    The matrix is of MPIAIJ type. The formulation is incompressible,
    in complex arithmetic, and the 2D physical domain is mapped to a
    logically rectangular, regular collocated grid with a high-order
    finite difference method.
    I determine the ownership of the rows/degrees of freedom of the
    matrix with PetscSplitOwnership; I am not using DMDA.
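
    A minimal sketch of that ownership setup, in Fortran with
    hypothetical variable names (nloc = local row count, nglob =
    global problem size):

    nloc = PETSC_DECIDE
    call PetscSplitOwnership(PETSC_COMM_WORLD,nloc,nglob,ierr)
    call MatCreate(PETSC_COMM_WORLD,A,ierr)
    call MatSetSizes(A,nloc,nloc,nglob,nglob,ierr)
    ! MATAIJ resolves to seqaij or mpiaij depending on the communicator
    call MatSetType(A,MATAIJ,ierr)
    call MatSetUp(A,ierr)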

    The Fortran application code is memory-leak free and has undergone
    a strict verification/validation procedure for different
    variations of the PDEs.

    If there is a problem with the matrix, what would help diagnose
    it? At this point I am running out of ideas, so I would really
    appreciate additional suggestions and discussion.

    Thanks for your continued support,


    Thibaut



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>
