Emmanuel:
This is a bug in PETSc. I've pushed a fix:
https://gitlab.com/petsc/petsc/-/commit/fd2a003f2c07165526de5c2fa5ca4f3c85618da7
You can apply it to your PETSc library, or add MatAssemblyBegin/End in your
application code until petsc-release is patched.
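The workaround mentioned above can be sketched as follows. This is a minimal sketch, not the reporter's actual code: the communicator, matrix sizes, and the use of MatCreateDense are placeholders for whatever the application already does.

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat C;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Create a parallel dense matrix; the global sizes (100 x 100) are
     illustrative only. */
  PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                           100, 100, NULL, &C));

  /* Workaround: assemble explicitly in application code until the
     petsc-release fix is available. */
  PetscCall(MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY));

  /* ... use C here ... */

  PetscCall(MatDestroy(&C));
  PetscCall(PetscFinalize());
  return 0;
}
```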
Thanks for reporting it and sending us your example.
Emmanuel:
You can create a dense C with the required parallel layout without
calling MatAssemblyBegin() and MatAssemblyEnd().
Did you get an error without calling these routines?
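For reference, creating a dense C with a specific parallel layout might look like the sketch below. The helper name and the local sizes mlocal/nlocal are assumptions for illustration, not part of the original discussion.

```c
#include <petscmat.h>

/* Sketch: create a dense C whose row/column distribution matches an
   existing layout. mlocal/nlocal stand in for the local sizes your
   application already uses; M/N are the global sizes. */
PetscErrorCode CreateDenseC(MPI_Comm comm, PetscInt mlocal, PetscInt nlocal,
                            PetscInt M, PetscInt N, Mat *C)
{
  PetscFunctionBeginUser;
  PetscCall(MatCreateDense(comm, mlocal, nlocal, M, N, NULL, C));
  /* Per the reply above, no MatAssemblyBegin/End should be needed here. */
  PetscFunctionReturn(PETSC_SUCCESS);
}
```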
We only updated the help manual, not the internal implementation. In the next
release, we'll introduce a new set of APIs.
Hao:
I would suggest using a parallel sparse direct solver, e.g., superlu_dist
or mumps. These solvers can take advantage of your sparse data structure.
Once that works, you can experiment with other preconditioners, such as
bjacobi + lu/ilu. See
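The suggested solvers are usually selected at runtime through PETSc options; a sketch, where the program name ./app and the process count are placeholders:

```shell
# Parallel sparse direct solve via SuperLU_DIST (or MUMPS):
mpiexec -n 4 ./app -ksp_type preonly -pc_type lu \
    -pc_factor_mat_solver_type superlu_dist   # or: mumps

# Once that works, try block Jacobi with ILU on each local block:
mpiexec -n 4 ./app -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu
```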