Dear PETSc users/developers,

I am currently trying to use the method `MatNullSpaceCreateRigidBody` together with `PCGAMG` to efficiently precondition an elasticity solver in 2D/3D.
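Roughly, my solver setup looks like the sketch below (assuming a recent PETSc with `PetscCall`; `A`, `b`, `x` are assembled elsewhere and `coords` is the coordinate vector I describe further down):

```c
#include <petscksp.h>

/* Minimal sketch: attach rigid-body modes to A and solve with GAMG. */
static PetscErrorCode SolveElasticity(Mat A, Vec coords, Vec b, Vec x)
{
  MatNullSpace nsp;
  KSP          ksp;
  PC           pc;

  PetscFunctionBeginUser;
  /* 3 rigid-body modes in 2D, 6 in 3D, built from nodal coordinates */
  PetscCall(MatNullSpaceCreateRigidBody(coords, &nsp));
  PetscCall(MatSetNearNullSpace(A, nsp)); /* near-null-space hint for GAMG */
  PetscCall(MatNullSpaceDestroy(&nsp));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCGAMG));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```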

I have managed to make it work in serial (i.e. with 1 MPI rank) with an h-independent iteration count (which is great), but the solver diverges in parallel.

I assume it has to do with the coordinate vector I am building the null space from not being set up correctly in parallel. The documentation is not very clear on exactly which nodes have to be set on each partition: should the vector contain only the nodes corresponding to locally owned dofs, or all the nodes on the partition (owned + ghost)? And what ghost layout, if any, should that `Vec` have?
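For concreteness, this is roughly how I build that coordinate vector at the moment (`nLocalNodes` and `nodeCoords` are just placeholders for my mesh data), in case the mistake is obvious:

```c
#include <petscvec.h>

/* My current attempt: one block of size dim per locally OWNED node,
   interlaced (x, y[, z]), no ghost entries, so the Vec has the same
   parallel layout as the solution vector. */
static PetscErrorCode BuildCoordVec(MPI_Comm comm, PetscInt dim,
                                    PetscInt nLocalNodes,
                                    const PetscScalar *nodeCoords,
                                    Vec *coords)
{
  PetscScalar *c;

  PetscFunctionBeginUser;
  PetscCall(VecCreate(comm, coords));
  PetscCall(VecSetSizes(*coords, nLocalNodes * dim, PETSC_DECIDE));
  PetscCall(VecSetBlockSize(*coords, dim)); /* needed by MatNullSpaceCreateRigidBody */
  PetscCall(VecSetFromOptions(*coords));
  PetscCall(VecGetArray(*coords, &c));
  for (PetscInt i = 0; i < nLocalNodes * dim; ++i) c[i] = nodeCoords[i];
  PetscCall(VecRestoreArray(*coords, &c));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```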

Any other tips about what I might be doing wrong?

Thanks,

Jordi
