Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-27 Thread 晓峰 何
Many thanks for your concrete replies. They are very helpful.

Best Wishes,
Xiaofeng



On Tue, Sep 27, 2022 at 6:20 AM 晓峰 何 
mailto:tlan...@hotmail.com>> wrote:
Hello,

A00 comes from shell structures and discretized by FEM.

Thanks,
Xiaofeng

On Sep 27, 2022, at 17:48, Mark Adams mailto:mfad...@lbl.gov>> 
wrote:

what equations and discetizations are in A00?

On Tue, Sep 27, 2022 at 1:45 AM 晓峰 何 
mailto:tlan...@hotmail.com>> wrote:
Hi Barry,

A00 is formed from elliptic operator.

I tried GAMG with A00, but it was extremely slow to solve the system with 
field-split preconditioner(I’m not sure I did it with the right way).

Thanks,
Xiaofeng

On Sep 26, 2022, at 23:11, Barry Smith 
mailto:bsm...@petsc.dev>> wrote:


  What is your A00 operator? ILU is almost never a good choice for large scale 
problems. If it is an elliptic operator that using a PC of gamg may work well 
for the A00 preconditioner instead of ILU.

  Barry

  For moderate size problems you can use a PC type LU for AOO to help you 
understand the best preconditioner to use for the A11 (the Schur complement 
block), once you have a good preconditioner for the A11 block you would then go 
back and determine a good preconditioner for the A00 block.

On Sep 26, 2022, at 10:08 AM, 晓峰 何 
mailto:tlan...@hotmail.com>> wrote:

Hello Jed,

The saddle point is due to Lagrange multipliers, thus the size of A11 is much 
smaller than A00.



Best Regards,

Xiaofeng


On Sep 26, 2022, at 21:03, Jed Brown 
mailto:j...@jedbrown.org>> wrote:

Lagrange multipliers







Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-27 Thread Barry Smith

Composed preconditioners (like field split) have multiple moving parts, and you need to "tune" them for each part separately; you cannot just run the entire preconditioner, get slow convergence on the entire problem, and then give up.

 So step one is to get a good preconditioner for the A00 block, and not worry about the entire fieldsplit yet. (Once you get good convergence on the A00 block you can tune the Schur complement preconditioner, but without good convergence on the A00 block it makes no sense to try to tune the Schur complement preconditioner.)

 You can run with options to monitor convergence of the A00 block and tune for that: -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_view -fieldsplit_0_pc_type gamg, and control the GAMG options with -fieldsplit_0_pc_gamg_*.

 As Mark said, you first need to provide the coordinate and null space information for the A00 block to have any hope of good performance.
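Gathered in one place, the options Barry describes could be passed in an options file or on the command line like this (the outer fieldsplit/Schur lines are my assumed setup for context, not from Barry's mail, and the threshold value is only an example of a -fieldsplit_0_pc_gamg_* option):

```
-pc_type fieldsplit
-pc_fieldsplit_type schur
-fieldsplit_0_ksp_monitor_true_residual
-fieldsplit_0_ksp_view
-fieldsplit_0_pc_type gamg
-fieldsplit_0_pc_gamg_threshold 0.01
```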

 



Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-27 Thread Mark Adams
Shells can be pretty hard.

Start with an easy problem with well-shaped elements, and it is probably best to start with a well-supported structure (e.g., not a long cantilever).

You should also configure with hypre and try that. I don't know if it deals with shells, but it is a well-developed solver.

For GAMG, you need to set the coordinates (PCSetCoordinates), which will create the rigid body modes for GAMG so that it can construct the rotational RBMs.
This code has not been used for shells in a long time
(petsc/src/ksp/pc/impls/gamg/agg.c:138), but it did once work (20 years ago
in another code).
This code constructs the 6 rigid body modes for 3D shell problems with my
ordering (6 dof: x, y, z, xx, yy, zz).

Or you can construct the RBMs yourself and set them with
https://petsc.org/release/docs/manualpages/Mat/MatNullSpaceCreateRigidBody.html

There is some discussion here:
https://www2.eecs.berkeley.edu/Pubs/TechRpts/2000/CSD-00-1103.pdf

Mark
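For reference, the construction Mark describes can be sketched in plain C. This is a hypothetical standalone version, not the actual code in petsc/src/ksp/pc/impls/gamg/agg.c: for a node at (x, y, z) with the 6-dof ordering (x, y, z, xx, yy, zz), the three translational modes put a 1 in the matching translational dof, and each rotational mode contributes the cross product of its axis with the node's offset from a reference point, plus a 1 in the matching rotational dof. MatNullSpaceCreateRigidBody builds the same kind of modes for you from a coordinate vector.

```c
/* Fill the 6 rigid body modes for one 3D shell node with dof ordering
 * (x, y, z, xx, yy, zz): 3 translations followed by 3 rotations about
 * a reference point (x0, y0, z0).  modes[m][d] is dof d of mode m.
 * A plain-C sketch of the idea, not the PETSc implementation. */
static void rigid_body_modes(double x, double y, double z,
                             double x0, double y0, double z0,
                             double modes[6][6])
{
    double dx = x - x0, dy = y - y0, dz = z - z0;
    for (int m = 0; m < 6; m++)
        for (int d = 0; d < 6; d++)
            modes[m][d] = 0.0;
    /* three translations */
    modes[0][0] = modes[1][1] = modes[2][2] = 1.0;
    /* rotation about x: u = e_x x r = (0, -dz, dy) */
    modes[3][1] = -dz; modes[3][2] = dy; modes[3][3] = 1.0;
    /* rotation about y: u = e_y x r = (dz, 0, -dx) */
    modes[4][0] = dz; modes[4][2] = -dx; modes[4][4] = 1.0;
    /* rotation about z: u = e_z x r = (-dy, dx, 0) */
    modes[5][0] = -dy; modes[5][1] = dx; modes[5][5] = 1.0;
}
```

Stacking these 6-entry rows over all nodes gives the six null-space vectors GAMG needs.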




Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-27 Thread 晓峰 何
Hello,

A00 comes from shell structures discretized by FEM.

Thanks,
Xiaofeng







Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-27 Thread Mark Adams
What equations and discretizations are in A00?



Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread 晓峰 何
Hi Barry,

A00 is formed from an elliptic operator.

I tried GAMG on A00, but solving the system with the field-split preconditioner was extremely slow (I'm not sure I did it the right way).

Thanks,
Xiaofeng






Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread Barry Smith
  
  What is your A00 operator? ILU is almost never a good choice for large-scale problems. If it is an elliptic operator, then using a PC of gamg instead of ILU may work well as the A00 preconditioner.

  Barry

  For moderate-size problems you can use a PC type of LU for A00 to help you understand the best preconditioner to use for A11 (the Schur complement block); once you have a good preconditioner for the A11 block, you would then go back and determine a good preconditioner for the A00 block.




Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread 晓峰 何
Hello Matt,

Many thanks for your suggestion.


BR,
Xiaofeng





Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread 晓峰 何
Hello Jed,

The saddle point is due to Lagrange multipliers; thus A11 is much smaller than A00.



Best Regards,

Xiaofeng





Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread Jed Brown
Xiaofeng, is your saddle point due to incompressibility or other constraints 
(like Lagrange multipliers for contact or multi-point constraints)? If 
incompressibility, are you working on structured or unstructured/non-nested 
meshes?



Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread Matthew Knepley
Another option is the PCPATCH solvers for multigrid, as shown in this
paper: https://arxiv.org/abs/1912.08516
which I believe solves incompressible elasticity. There is an example in
PETSc for Stokes, I believe.

  Thanks,

 Matt


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread 晓峰 何
Are there other approaches in PETSc for solving this kind of system, besides field-split methods?

Thanks,
Xiaofeng




Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-26 Thread Jed Brown
This is the joy of factorization field-split methods. The actual Schur 
complement is dense, so we represent it implicitly. A common strategy is to 
assemble the mass matrix and drop it in the 11 block of the Pmat. You can check 
out some examples in the repository for incompressible flow (Stokes problems). 
The LSC (least squares commutator) is another option. You'll likely find that 
lumping diag(A00)^{-1} works poorly because the resulting operator behaves like 
a Laplacian rather than like a mass matrix.
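Jed's "represent it implicitly" can be illustrated with a toy in plain C (hypothetical sizes and matrices; in PETSc the MATSCHURCOMPLEMENT matrix plays this role, with an inner KSP solve on A00 in place of the explicit inverse): S = A11 - A10 A00^{-1} A01 is never assembled; S*x is applied through its factors.

```c
/* Toy sketch: apply the Schur complement S = A11 - A10 inv(A00) A01 to a
 * vector without ever forming S (which is dense in general).  A00 is taken
 * diagonal here so inv(A00) is trivial; PETSc instead runs an inner solve. */
enum { N0 = 2, N1 = 1 };   /* hypothetical block sizes */

static void schur_apply(const double A00diag[N0],  /* diagonal of A00 */
                        const double A01[N0][N1],
                        const double A10[N1][N0],
                        const double A11[N1][N1],
                        const double x[N1], double y[N1])
{
    double t[N0];
    /* t = inv(A00) * (A01 * x)   -- in PETSc, an inner KSP solve with A00 */
    for (int i = 0; i < N0; i++) {
        double s = 0.0;
        for (int j = 0; j < N1; j++) s += A01[i][j] * x[j];
        t[i] = s / A00diag[i];
    }
    /* y = A11 * x - A10 * t */
    for (int i = 0; i < N1; i++) {
        double s = 0.0;
        for (int j = 0; j < N1; j++) s += A11[i][j] * x[j];
        for (int j = 0; j < N0; j++) s -= A10[i][j] * t[j];
        y[i] = s;
    }
}
```

Note that even with A11 = 0 (zero diagonal, as in the Lagrange multiplier block), the resulting S is generally nonsingular, which is why the elimination works where plain ILU on A fails.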



Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-25 Thread 晓峰 何
If I assign a preconditioner to A11 with these cmd options:

   -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type ilu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type ilu

then I get this error:

"Could not locate a solver type for factorization type ILU and matrix type schurcomplement"

How can I specify a preconditioner for A11?

BR,
Xiaofeng


On Sep 26, 2022, at 11:02, 晓峰 何 <tlan...@hotmail.com> wrote:

-fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type ilu -fieldsplit_1_ksp_type 
gmres -fieldsplit_1_pc_type none



Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-25 Thread Jed Brown
The usual issue is that you need a preconditioner for the Schur complement S = A11 - A10 A00^{-1} A01. For incompressible elasticity, this S is spectrally equivalent to a scaled mass matrix.



[petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-25 Thread 晓峰 何
Hi all,

I have a linear system arising from structural mechanics with zeros in the diagonal entries:

A = (A00 A01
     A10 A11), where A00 is invertible and the diagonal entries of A11 are all zero.

I used GMRES with an ILU preconditioner in PETSc to solve this system, and got this error:

"PC failed due to FACTOR_NUMERIC_ZEROPIVOT"

I googled and found that a field-split preconditioner should be applied to this kind of system. I passed the -pc_type fieldsplit -pc_fieldsplit_detect_saddle_point options to the program, and got another error:

"PC failed due to SUBPC_ERROR"

After that, I tried to split the matrix in code:

IS is;
ISCreateStride(PETSC_COMM_SELF, row count of A00, 0, 1, &is);
PCFieldSplitSetIS(pc, "0", is);

IS is2;
ISCreateStride(PETSC_COMM_SELF, row count of A11, start row of A11, 1, &is2);
PCFieldSplitSetIS(pc, "1", is2);

I recompiled the code and ran it with additional options:

 -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type ilu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type none

To my surprise, the Krylov solver ran extremely slowly. I googled and found the reason is that if A00 is much bigger than A11 (e.g. A00 has 1 rows and A11 has 50 rows), then the Schur complement is inefficient.

Could you help me solve this kind of system efficiently?

Best regards,

Xiaofeng
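As a toy illustration of the FACTOR_NUMERIC_ZEROPIVOT failure (plain C, made-up numbers, not the original system): factorizations without pivoting break down when elimination reaches a zero pivot, and zero diagonal entries like those in A11 are exactly where this can happen; ILU makes it more likely still, since it drops the fill that might otherwise keep later pivots nonzero.

```c
#include <math.h>
#include <stdbool.h>

/* Naive in-place LU without pivoting -- a stand-in for what an
 * incomplete factorization does to the diagonal.  Returns false on a
 * (near-)zero pivot, the situation PETSc reports as
 * FACTOR_NUMERIC_ZEROPIVOT. */
#define N 2
static bool lu_nopivot(double a[N][N])
{
    for (int k = 0; k < N; k++) {
        if (fabs(a[k][k]) < 1e-12) return false;   /* zero pivot */
        for (int i = k + 1; i < N; i++) {
            a[i][k] /= a[k][k];
            for (int j = k + 1; j < N; j++)
                a[i][j] -= a[i][k] * a[k][j];
        }
    }
    return true;
}
```

For example, {{0,1},{1,2}} fails immediately (the zero sits on a leading pivot), while {{2,1},{1,0}} factors fine because eliminating the first block turns the zero A11 entry into a nonzero Schur pivot; that elimination ordering is exactly what the Schur fieldsplit exploits.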