" That's much better than what you had before. What option specifically did you try?"
In the raw tutorial code (step-56), the Schur complement was approximated by the mass matrix on the pressure space. However, this is likely a poor approximation if the pressure is element-wise constant.

PETSc offers several options for approximating the Schur complement. One of them is "selfp", where the approximation reads S = B*diag(A)^-1*B^T. That seems a sensible choice, considering that the exact Schur complement is S = B*A^-1*B^T. The dealii::SparseMatrix class provides an mmult() function, with an optional diagonal-scaling vector argument, that implements exactly this triple product.

To answer your question: I used this "selfp"-type approximation, i.e. S = B*diag(A)^-1*B^T. With it, the number of outer iterations was reduced to 41 (Cycle 2) and 82 (Cycle 3), but the inner iterations for solving with S increased disproportionately, to 3379 (Cycle 2) and 13308 (Cycle 3). I continued using SolverCG with SparseILU for the inner solve, since S = B*diag(A)^-1*B^T is symmetric and positive definite. Reducing the inner iteration counts for S would make this approach competitive with the performance I observed for the stable Q2-Q1 element. But SolverCG + SparseILU (or AMG) is likely a reasonable choice for solving S*y=z with S = B*diag(A)^-1*B^T, right? A condensed sketch of what I did follows below.

PETSc also offers the "self" option, which hands the Schur complement itself to the preconditioner; according to the documentation, the only preconditioner that currently works with that is the Least Squares Commutator method (see here <https://petsc.org/release/manualpages/PC/PCLSC/#id1650>). I am not yet familiar with the math behind it, so I cannot assess whether it would yield a further improvement.
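For reference, here is a condensed sketch of the two pieces: forming S = B*diag(A)^-1*B^T through SparseMatrix::mmult(), which computes C = A*diag(V)*B when given the optional vector argument, and the inner solve with SolverCG + SparseILU. The function and variable names, the block layout, and the tolerance are placeholders rather than literal code from my program:

#include <deal.II/lac/solver_cg.h>
#include <deal.II/lac/solver_control.h>
#include <deal.II/lac/sparse_ilu.h>
#include <deal.II/lac/sparse_matrix.h>
#include <deal.II/lac/sparsity_pattern.h>
#include <deal.II/lac/vector.h>

using namespace dealii;

// A = system_matrix.block(0,0), B = block(1,0), Bt = block(0,1),
// assuming the usual two-field block layout; y must already have
// the size of the pressure space.
void solve_schur(const SparseMatrix<double> &A,
                 const SparseMatrix<double> &B,
                 const SparseMatrix<double> &Bt,
                 Vector<double>             &y,
                 const Vector<double>       &z)
{
  // Inverse of the diagonal of A.
  Vector<double> inv_diag_A(A.m());
  for (SparseMatrix<double>::size_type i = 0; i < A.m(); ++i)
    inv_diag_A(i) = 1. / A.diag_element(i);

  // S_approx = B * diag(A)^{-1} * B^T. mmult() rebuilds the sparsity
  // pattern of the result by default, so it suffices to bind S_approx
  // to an (initially empty) pattern that outlives the matrix.
  SparsityPattern      sparsity_S;
  SparseMatrix<double> S_approx(sparsity_S);
  B.mmult(S_approx, Bt, inv_diag_A);

  // Inner solve S_approx * y = z; S_approx is SPD, so CG applies.
  SparseILU<double> ilu;
  ilu.initialize(S_approx);

  SolverControl            control(10000, 1e-6 * z.l2_norm());
  SolverCG<Vector<double>> cg(control);
  cg.solve(S_approx, y, z, ilu);
}

(In practice one would of course build S_approx once after assembly, not in every application of the preconditioner.)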
Best,
Simon

On Wednesday, September 10, 2025 at 2:05:55 AM UTC+2 Wolfgang Bangerth wrote:

> On 9/9/25 14:07, Simon wrote:
> >
> > I am aware that choosing an unstable element is generally not recommended.
> > However, despite not being LBB stable, the Q1-P0 element is still commonly
> > used in solid mechanics and included in commercial codes. In practice, it
> > performs quite well for my benchmark problems and the computational savings
> > for not choosing quadratic displacement elements are significant.
>
> Yes, that's true. As mentioned in the paper you referenced, the element is
> also widely used in the geosciences. Of course, the issue with the small
> eigenvalues is a problem for solvers everywhere else too.
>
> > Besides the condition number, I am wondering whether the increased outer
> > iterations I presented could stem from a poor approximation of the Schur
> > complement, S = B*A^-1*B^T?
> > Following one of the approaches PETSc uses for saddle point systems (see
> > <https://petsc.org/main/manualpages/PC/PCFieldSplitSetSchurPre/>),
> > I tried approximating the Schur complement as
> > S = B*diag(A)^-1*B^T
> > and obtained the following results:
> >
> > * Cycle 2: 41 (Outer), 434 Inner (A), 3379 Inner (S)
> > * Cycle 3: 82 (Outer), 1669 Inner (A), 13308 Inner (S)
>
> That's much better than what you had before. What option specifically did
> you try?
>
> > The outer FGMRES iterations are quite reasonable, but the number of inner
> > iterations for solving with the Schur complement clearly increased
> > significantly. Sloppily speaking, did I simply shift the conditioning
> > issue from the outer solve to the inner Schur solve?
> > Using AMG for the Schur solve helped to reduce the inner iteration counts
> > for S to 530 (Cycle 2) and 2746 (Cycle 3). However, the total CPU time
> > remained similar to that of using SparseILU.
> >
> > Given all this, do you have any recommendations for a more effective Schur
> > complement approximation and/or preconditioner? Ideally, this can be
> > generalized to incompressible elasticity as well.
>
> I don't have any good suggestions. You probably already saw that, but it's
> worth reading through the "Possibilities for extensions" section of step-22.
>
> I had a graduate student who spent 3 years working on finding preconditioners
> for a block system, with only moderate success. I'm currently working on a
> problem where I'm also having great trouble finding decent preconditioners,
> despite trying the best ideas I have on that topic. Preconditioning block
> systems is hard :-( That's why people still write papers about it, and why
> the issue is mentioned in many tutorials.
>
> Best
> W.
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth email: [email protected]
> www: http://www.math.colostate.edu/~bangerth/
