>>
>> What about a less general (but important) case: saddle point
>> problems arising from incompressible Stokes, Oseen and
>> Navier-Stokes eqs. with Schur type preconditioning. In 2D with N
>> cells and co-located variables arranged
>> as (u1,...,uN,v1,...,vN,p1,...,pN), the matrix would have the form
>> [Q G; D 0] with Q a 2N-by-2N matrix, G a 2N-by-N matrix and D an
>> N-by-2N matrix. Since the variables are co-located, they share
>> the same partitioning but could have different stencils. How to use
>> the "split local space", DMComposite and MATNEST in this case?
>>
>
>If you order this way, then you don't need DMComposite or MatNest (although
>you can still make a MatNest that operates in this ordering, we just don't
>have a way to make it automatically).
>

So I created four matrices corresponding to the four blocks above and
assembled them into a nested matrix (MATNEST). Then I tried to solve
with Schur preconditioning (see below), but apparently the matrix is
still treated as a single block. I must be misunderstanding the concept...
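For completeness, here is a minimal sketch of the setup I think is needed: my understanding is that PCFieldSplit does not discover the splits from the MATNEST automatically, so the index sets have to be handed to it explicitly via PCFieldSplitSetIS (the routine MatNestGetISs to query them from the matrix, and the function/variable names below, are my assumptions, not tested code from this thread):

```c
/* Sketch (untested): register the two MatNest blocks as fields with
   PCFieldSplit, instead of relying on automatic field detection.
   A is assumed to be the assembled 2x2 MATNEST from above. */
#include <petscksp.h>

PetscErrorCode SolveWithSchur(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  IS             isg[2];
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
  ierr = PCFieldSplitSetType(pc,PC_COMPOSITE_SCHUR);CHKERRQ(ierr);

  /* Query the row index sets of the nest and declare them as fields,
     so -pc_fieldsplit_type schur sees two blocks rather than one. */
  ierr = MatNestGetISs(A,isg,PETSC_NULL);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"0",isg[0]);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"1",isg[1]);CHKERRQ(ierr);

  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

With the ISs registered, the same command line as below should then split the operator into the velocity and pressure blocks.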

$ mpiexec -n 1 ./matnest-try -ksp_view -pc_type fieldsplit -pc_fieldsplit_type schur
  Matrix object:
    type=nest, rows=2, cols=2
    MatNest structure:
    (0,0) : prefix="a00_", type=mpiaij, rows=24, cols=24
    (0,1) : prefix="a01_", type=mpiaij, rows=24, cols=12
    (1,0) : prefix="a10_", type=mpiaij, rows=12, cols=24
    (1,1) : prefix="a11_", type=mpiaij, rows=12, cols=12
[0]PETSC ERROR: --------------------- Error Message 
------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data!
[0]PETSC ERROR: Unhandled case, must have at least two fields!
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 13:45:54 CDT 
2011
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: ./matnest-try on a linux_64b named lin0133 by cklaij Fri Jan 27 
09:25:01 2012
[0]PETSC ERROR: Libraries linked from 
/opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5/lib
[0]PETSC ERROR: Configure run at Thu Jan 26 13:44:12 2012
[0]PETSC ERROR: Configure options 
--prefix=/opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5 
--with-mpi-dir=/opt/refresco/64bit_intelv11.1_openmpi/openmpi-1.4.4 --with-x=1 
--with-mpe=0 --with-debugging=1 --with-clanguage=c++ 
--with-hypre-include=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/include
 
--with-hypre-lib=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/lib/libHYPRE.a
 --with-ml-include=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/include 
--with-ml-lib=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/lib/libml.a 
--with-blas-lapack-dir=/opt/intel/mkl
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: PCFieldSplitSetDefaults() line 319 in 
/home/CKlaij/ReFRESCO/Libraries/build/petsc-3.2-p5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: PCSetUp_FieldSplit() line 335 in 
/home/CKlaij/ReFRESCO/Libraries/build/petsc-3.2-p5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: PCSetUp() line 819 in 
/home/CKlaij/ReFRESCO/Libraries/build/petsc-3.2-p5/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 260 in 
/home/CKlaij/ReFRESCO/Libraries/build/petsc-3.2-p5/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 379 in 
/home/CKlaij/ReFRESCO/Libraries/build/petsc-3.2-p5/src/ksp/ksp/interface/itfunc.c



dr. ir. Christiaan Klaij
CFD Researcher
Research & Development
E mailto:C.Klaij at marin.nl
T +31 317 49 33 44

MARIN
2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl
