Dear Artur,

On 28/07/16 02:20, Safin, Artur wrote:
> Barry, Lawrence,
>
>> I think the SubKSPs (and therefore SubPCs) are not set up until you call
>> KSPSetUp(ksp), which your code does not do explicitly and is therefore done
>> in KSPSolve.
>
> I added KSPSetUp(), but unfortunately the issue did not go away.
>
> I have created a MWE that replicates the issue. The program tries to solve a
> tridiagonal system, where the first fieldsplit partitions the global matrix
>
>     [ P x ]
>     [ x T ],
>
> and the nested fieldsplit partitions P into
>
>     [ A x ]
>     [ x B ].
Two things:

1. Always check the return value from all PETSc calls. This will normally
give you a very useful backtrace when something goes wrong. That is,
annotate all your calls with:

    PetscErrorCode ierr;
    ierr = SomePetscFunction(...); CHKERRQ(ierr);

If I do this, I see that the call to KSPSetUp fails:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Unhandled case, must have at least two fields, not 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.7.2-931-g1e46b98  GIT Date: 2016-07-06 16:57:50 -0500
...
[0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 470 in /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 487 in /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: #3 PCSetUp() line 968 in /data/lmitche1/src/deps/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #4 KSPSetUp() line 393 in /data/lmitche1/src/deps/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 main() line 65 in /homes/lmitche1/tmp/ex.c

2. The reason is that you need to call KSPSetUp *after* setting the
outermost fieldsplit ISes: until then, the outer PCFieldSplit has only one
field, and the sub-KSPs do not exist yet. If I move the call to KSPSetUp,
then things seem to work. I've attached the working code.
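In outline, the call order that works is the following (this is a condensed
excerpt from the attached code, not a complete program; the variable names
match the attachment):

```c
/* Outer fieldsplit ISes must be registered before KSPSetUp ... */
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc, "P", P_IS);CHKERRQ(ierr);   /* outer splits first */
ierr = PCFieldSplitSetIS(pc, "T", T_IS);CHKERRQ(ierr);
ierr = KSPSetUp(ksp);CHKERRQ(ierr);                      /* ... now the sub-KSPs exist */

/* ... so the nested split can be attached to the "P" sub-PC */
ierr = PCFieldSplitGetSubKSP(pc, &nsplit, &ksp_all);CHKERRQ(ierr);
ierr = KSPGetPC(ksp_all[0], &pc_P);CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc_P, "A", P_A_IS);CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc_P, "B", P_B_IS);CHKERRQ(ierr);
ierr = KSPSolve(ksp, b, u);CHKERRQ(ierr);
```

Calling KSPSetUp any earlier trips the "must have at least two fields"
error above, because the outer split has not been told about its fields yet.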
Cheers,

Lawrence

$ cat options.txt
-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
-fieldsplit_T_ksp_type bcgs
-fieldsplit_P_ksp_type gmres
-fieldsplit_P_pc_type fieldsplit
-fieldsplit_P_pc_fieldsplit_type multiplicative
-fieldsplit_P_fieldsplit_A_ksp_type gmres
-fieldsplit_P_fieldsplit_B_pc_type lu
-fieldsplit_P_fieldsplit_B_ksp_type preonly
-ksp_converged_reason
-ksp_monitor_true_residual
-ksp_view

$ ./ex -options_file options.txt
  0 KSP preconditioned resid norm 5.774607007892e+00 true resid norm 1.414213562373e+00 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 1.921795888956e-01 true resid norm 4.802975385197e-02 ||r(i)||/||b|| 3.396216464745e-02
  2 KSP preconditioned resid norm 1.436304589027e-12 true resid norm 2.435255920058e-13 ||r(i)||/||b|| 1.721985974998e-13
Linear solve converged due to CONVERGED_RTOL iterations 2
KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: fieldsplit
    FieldSplit with MULTIPLICATIVE composition: total splits = 2
    Solver info for each split is in the following KSP objects:
    Split number 0 Defined by IS
    KSP Object: (fieldsplit_P_) 1 MPI processes
      type: gmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using PRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_P_) 1 MPI processes
        type: fieldsplit
          FieldSplit with MULTIPLICATIVE composition: total splits = 2
          Solver info for each split is in the following KSP objects:
          Split number 0 Defined by IS
          KSP Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes
            type: gmres
              GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
              GMRES: happy breakdown tolerance 1e-30
            maximum iterations=10000, initial guess is zero
            tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
            left preconditioning
            using PRECONDITIONED norm type for convergence test
            PC Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes
              type: ilu
                ILU: out-of-place factorization
                0 levels of fill
                tolerance for zero pivot 2.22045e-14
                matrix ordering: natural
                factor fill ratio given 1., needed 1.
                  Factored matrix follows:
                    Mat Object: 1 MPI processes
                      type: seqaij
                      rows=25, cols=25
                      package used to perform factorization: petsc
                      total: nonzeros=73, allocated nonzeros=73
                      total number of mallocs used during MatSetValues calls =0
                        not using I-node routines
              linear system matrix = precond matrix:
              Mat Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes
                type: seqaij
                rows=25, cols=25
                total: nonzeros=73, allocated nonzeros=73
                total number of mallocs used during MatSetValues calls =0
                  not using I-node routines
          Split number 1 Defined by IS
          KSP Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes
            type: preonly
            maximum iterations=10000, initial guess is zero
            tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
            left preconditioning
            using NONE norm type for convergence test
            PC Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes
              type: lu
                LU: out-of-place factorization
                tolerance for zero pivot 2.22045e-14
                matrix ordering: nd
                factor fill ratio given 5., needed 1.43836
                  Factored matrix follows:
                    Mat Object: 1 MPI processes
                      type: seqaij
                      rows=25, cols=25
                      package used to perform factorization: petsc
                      total: nonzeros=105, allocated nonzeros=105
                      total number of mallocs used during MatSetValues calls =0
                        not using I-node routines
              linear system matrix = precond matrix:
              Mat Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes
                type: seqaij
                rows=25, cols=25
                total: nonzeros=73, allocated nonzeros=73
                total number of mallocs used during MatSetValues calls =0
                  not using I-node routines
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_P_) 1 MPI processes
          type: seqaij
          rows=50, cols=50
          total: nonzeros=148, allocated nonzeros=148
          total number of mallocs used during MatSetValues calls =0
            not using I-node routines
    Split number 1 Defined by IS
    KSP Object: (fieldsplit_T_) 1 MPI processes
      type: bcgs
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using PRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_T_) 1 MPI processes
        type: ilu
          ILU: out-of-place factorization
          0 levels of fill
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 1., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqaij
                rows=50, cols=50
                package used to perform factorization: petsc
                total: nonzeros=148, allocated nonzeros=148
                total number of mallocs used during MatSetValues calls =0
                  not using I-node routines
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_T_) 1 MPI processes
          type: seqaij
          rows=50, cols=50
          total: nonzeros=148, allocated nonzeros=148
          total number of mallocs used during MatSetValues calls =0
            not using I-node routines
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=100, cols=100
    total: nonzeros=298, allocated nonzeros=500
    total number of mallocs used during MatSetValues calls =0
      not using I-node routines
#include <petsc.h>

#define N 100

int main(int argc, char *argv[])
{
  Mat            A;
  Vec            u, b;
  PetscErrorCode ierr;
  KSP            ksp;
  PC             pc;
  PetscInt       i;

  ierr = PetscInitialize(&argc, &argv, (char*)0, "void");if (ierr) return ierr;

  /* Assemble the tridiagonal system matrix */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  for (i = 0; i < N-1; ++i) {
    ierr = MatSetValue(A, i, i, 2, INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A, i+1, i, -1, INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(A, i, i+1, -1, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatSetValue(A, N-1, N-1, 2, INSERT_VALUES);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Manufacture a right-hand side b = A*u from the exact solution u = 1 */
  ierr = VecCreate(PETSC_COMM_WORLD, &u);CHKERRQ(ierr);
  ierr = VecSetSizes(u, PETSC_DECIDE, N);CHKERRQ(ierr);
  ierr = VecSetFromOptions(u);CHKERRQ(ierr);
  ierr = VecDuplicate(u, &b);CHKERRQ(ierr);
  ierr = VecSet(u, 1);CHKERRQ(ierr);
  ierr = MatMult(A, u, b);CHKERRQ(ierr);

  /* Create index sets: [P | T] at the outer level, P = [A | B] at the inner level */
  PetscInt P_A_indices[N/4];
  PetscInt P_B_indices[N/4];
  PetscInt T_indices[N/2];
  PetscInt P_indices[N/2];
  for (i = 0; i < N/2; ++i) {
    P_indices[i] = i;
    T_indices[i] = i + N/2;
  }
  for (i = 0; i < N/4; ++i) {
    P_A_indices[i] = i;
    P_B_indices[i] = i + N/4;
  }
  IS P_A_IS, P_B_IS, P_IS, T_IS;
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, N/2, P_indices, PETSC_COPY_VALUES, &P_IS);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, N/2, T_indices, PETSC_COPY_VALUES, &T_IS);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, N/4, P_A_indices, PETSC_COPY_VALUES, &P_A_IS);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_WORLD, N/4, P_B_indices, PETSC_COPY_VALUES, &P_B_IS);CHKERRQ(ierr);

  /* Solve the system */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);

  /* Define the fieldsplit for the global matrix; the outer ISes must be
     set before KSPSetUp is called */
  ierr = PCFieldSplitSetIS(pc, "P", P_IS);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "T", T_IS);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);

  /* Nested fieldsplit for submatrix P; the sub-KSPs only exist after KSPSetUp */
  KSP *ksp_all, ksp_P;
  PetscInt nsplit;
  ierr = PCFieldSplitGetSubKSP(pc, &nsplit, &ksp_all);CHKERRQ(ierr);
  ksp_P = ksp_all[0];
  PC pc_P;
  ierr = KSPGetPC(ksp_P, &pc_P);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc_P, "A", P_A_IS);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc_P, "B", P_B_IS);CHKERRQ(ierr);

  ierr = KSPSolve(ksp, b, u);CHKERRQ(ierr);

  /* Clean up: the array from PCFieldSplitGetSubKSP is ours to free */
  ierr = PetscFree(ksp_all);CHKERRQ(ierr);
  ierr = ISDestroy(&P_A_IS);CHKERRQ(ierr);
  ierr = ISDestroy(&P_B_IS);CHKERRQ(ierr);
  ierr = ISDestroy(&P_IS);CHKERRQ(ierr);
  ierr = ISDestroy(&T_IS);CHKERRQ(ierr);
  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}