Matt,

Attached is a small Fortran code that replicates the second problem.
Chris

dr. ir. Christiaan Klaij
CFD Researcher, Research & Development
MARIN, 2 Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
E [email protected]
T +31 317 49 33 44 (direct), T +31 317 49 39 11, F +31 317 49 32 45
I www.marin.nl

MARIN news: MARIN at SMM, Hamburg, September 9-12
http://www.marin.nl/web/News/News-items/MARIN-at-SMM-Hamburg-September-912.htm

This e-mail may be confidential, privileged and/or protected by copyright. If you are not the intended recipient, you should return it to the sender immediately and delete your copy from your system.

________________________________
From: Klaij, Christiaan
Sent: Friday, August 29, 2014 4:42 PM
To: Matthew Knepley
Cc: [email protected]
Subject: RE: [petsc-users] PCFieldSplitSetSchurPre in fortran

Matt,

The small test code (ex70) is in C and it works fine; the problem happens in a big Fortran code. I will try to replicate the problem in a small Fortran code, but that will take some time.

Chris

________________________________
From: Matthew Knepley <[email protected]>
Sent: Friday, August 29, 2014 4:14 PM
To: Klaij, Christiaan
Cc: [email protected]
Subject: Re: [petsc-users] PCFieldSplitSetSchurPre in fortran

On Fri, Aug 29, 2014 at 8:55 AM, Klaij, Christiaan <[email protected]> wrote:

> I'm trying PCFieldSplitSetSchurPre with PC_FIELDSPLIT_SCHUR_PRE_SELFP in
> petsc-3.5.1 using Fortran.
>
> The first problem is that PC_FIELDSPLIT_SCHUR_PRE_SELFP seems to be missing
> in Fortran; I get the compile error
>
>   This name does not have a type, and must have an explicit type.   [PC_FIELDSPLIT_SCHUR_PRE_SELFP]
>
> while compilation works fine with _A11, _SELF and _USER.

Mark Adams has just fixed this.

> The second problem is that the call doesn't seem to have any effect.
> For example, I have
>
>   CALL PCFieldSplitSetSchurPre(pc,PC_FIELDSPLIT_SCHUR_PRE_USER,aa,ierr)
>   CALL PCFieldSplitSetSchurFactType(pc,PC_FIELDSPLIT_SCHUR_FACT_LOWER,ierr)
>
> This compiles and runs, but ksp_view tells me
>
>   PC Object:(sys_) 3 MPI processes
>     type: fieldsplit
>       FieldSplit with Schur preconditioner, factorization LOWER
>       Preconditioner for the Schur complement formed from A11
>
> So changing the factorization from the default FULL to LOWER did work, but
> changing the preconditioner from A11 to USER didn't.
>
> I've also tried to run directly from the command line using
>
>   -sys_pc_fieldsplit_schur_precondition user -sys_ksp_view
>
> This works in the sense that I don't get the "WARNING! There are options you
> set that were not used!" message, but ksp_view still reports A11 instead of
> the user-provided matrix.
>
> Chris

Can you send a small test code, since I use this every day here and it works.

  Thanks,

     Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
! Running with ./fieldsplitbug -ksp_view shows:
!
!   PC Object: 1 MPI processes
!     type: fieldsplit
!       FieldSplit with Schur preconditioner, factorization LOWER
!       Preconditioner for the Schur complement formed from A11
!
! despite the call to PCFieldSplitSetSchurPre.

program fieldsplitbug

  implicit none

#include <finclude/petscsys.h>
#include <finclude/petscis.h>
#include <finclude/petscvec.h>
#include <finclude/petscmat.h>
#include <finclude/petscpc.h>
#include <finclude/petscksp.h>
#include <finclude/petscviewer.h>

  PetscErrorCode :: ierr
  PetscInt       :: n=4, numsplit=1
  PetscScalar    :: zero=0.0, one=1.0
  Vec            :: diag1, diag3, x, b
  Mat            :: A, subA(4), myS
  PC             :: pc
  KSP            :: ksp, subksp(2)
  IS             :: isg(2)

  call PetscInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr)

  ! vectors
  call VecCreateMPI(MPI_COMM_WORLD,n,PETSC_DECIDE,diag1,ierr); CHKERRQ(ierr)
  call VecSet(diag1,one,ierr); CHKERRQ(ierr)
  call VecCreateMPI(MPI_COMM_WORLD,3*n,PETSC_DECIDE,diag3,ierr); CHKERRQ(ierr)
  call VecSet(diag3,one,ierr); CHKERRQ(ierr)
  call VecCreateMPI(MPI_COMM_WORLD,4*n,PETSC_DECIDE,x,ierr); CHKERRQ(ierr)
  call VecSet(x,zero,ierr); CHKERRQ(ierr)
  call VecDuplicate(x,b,ierr); CHKERRQ(ierr)
  call VecSet(b,one,ierr); CHKERRQ(ierr)

  ! matrix a00
  call MatCreateAIJ(MPI_COMM_WORLD,3*n,3*n,PETSC_DECIDE,PETSC_DECIDE, &
                    1,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,subA(1),ierr); CHKERRQ(ierr)
  call MatDiagonalSet(subA(1),diag3,INSERT_VALUES,ierr); CHKERRQ(ierr)
  call MatAssemblyBegin(subA(1),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
  call MatAssemblyEnd(subA(1),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)

  ! matrix a01
  call MatCreateAIJ(MPI_COMM_WORLD,3*n,n,PETSC_DECIDE,PETSC_DECIDE, &
                    0,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,subA(2),ierr); CHKERRQ(ierr)
  call MatAssemblyBegin(subA(2),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
  call MatAssemblyEnd(subA(2),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)

  ! matrix a10
  call MatCreateAIJ(MPI_COMM_WORLD,n,3*n,PETSC_DECIDE,PETSC_DECIDE, &
                    0,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,subA(3),ierr); CHKERRQ(ierr)
  call MatAssemblyBegin(subA(3),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
  call MatAssemblyEnd(subA(3),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)

  ! matrix a11
  call MatCreateAIJ(MPI_COMM_WORLD,n,n,PETSC_DECIDE,PETSC_DECIDE, &
                    1,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,subA(4),ierr); CHKERRQ(ierr)
  call MatDiagonalSet(subA(4),diag1,INSERT_VALUES,ierr); CHKERRQ(ierr)
  call MatAssemblyBegin(subA(4),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
  call MatAssemblyEnd(subA(4),MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)

  ! nested mat [a00,a01;a10,a11]
  call MatCreateNest(MPI_COMM_WORLD,2,PETSC_NULL_OBJECT,2,PETSC_NULL_OBJECT, &
                     subA,A,ierr); CHKERRQ(ierr)
  call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
  call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)

  ! user-provided matrix to precondition the Schur complement
  call MatCreateAIJ(MPI_COMM_WORLD,n,n,PETSC_DECIDE,PETSC_DECIDE, &
                    1,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,myS,ierr); CHKERRQ(ierr)
  call MatDiagonalSet(myS,diag1,INSERT_VALUES,ierr); CHKERRQ(ierr)
  call MatAssemblyBegin(myS,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
  call MatAssemblyEnd(myS,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)

  ! KSP and PC
  call KSPCreate(MPI_COMM_WORLD,ksp,ierr); CHKERRQ(ierr)
  call KSPGetPC(ksp,pc,ierr); CHKERRQ(ierr)
  call PCSetType(pc,PCFIELDSPLIT,ierr); CHKERRQ(ierr)
  call PCFieldSplitSetType(pc,PC_COMPOSITE_SCHUR,ierr); CHKERRQ(ierr)
  call MatNestGetISs(A,isg,PETSC_NULL_OBJECT,ierr); CHKERRQ(ierr)
  call PCFieldSplitSetIS(pc,"0",isg(1),ierr); CHKERRQ(ierr)
  call PCFieldSplitSetIS(pc,"1",isg(2),ierr); CHKERRQ(ierr)
  call KSPSetOperators(ksp,A,A,ierr); CHKERRQ(ierr)
  call KSPSetFromOptions(ksp,ierr); CHKERRQ(ierr)
  call KSPSetUp(ksp,ierr); CHKERRQ(ierr)

  ! this call successfully overrules the default FULL factorization
  call PCFieldSplitSetSchurFactType(pc,PC_FIELDSPLIT_SCHUR_FACT_LOWER,ierr); CHKERRQ(ierr)

  ! this call should overrule the default A11 Schur pc, but doesn't seem to work
  call PCFieldSplitSetSchurPre(pc,PC_FIELDSPLIT_SCHUR_PRE_USER,myS,ierr); CHKERRQ(ierr)

  call PCFieldSplitGetSubKSP(pc,numsplit,subksp,ierr); CHKERRQ(ierr)
  call KSPSolve(ksp,b,x,ierr); CHKERRQ(ierr)
  call KSPGetSolution(ksp,x,ierr); CHKERRQ(ierr)
  call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr); CHKERRQ(ierr)

  call PetscFinalize(ierr)

end program fieldsplitbug
