On Thu, Feb 17, 2022 at 7:01 AM Bojan Niceno <bojan.niceno.scient...@gmail.com> wrote:

> Dear all,
>
> I am coupling my unstructured CFD solver with PETSc. At the moment the
> sequential version is working fine, but I obviously want to migrate to
> MPI parallel; my code has been MPI parallel for ages.
>
> Anyhow, as part of the migration to parallel, I changed the matrix type
> from MATSEQAIJ to MATMPIAIJ. The code compiled, but when I executed it
> on one processor, I received an error message saying that this
> combination of matrix format, BICG solver, and PCILU preconditioner is
> not supported. I took a look at the compatibility table
> (https://petsc.org/release/overview/linear_solve_table/#preconditioners)
> and noticed that MATMPIAIJ supports only the MKL CPardiso
> preconditioner, which seems to belong to Intel.
>
> I did some more reading and realised that I should probably continue
> with MATAIJ (which should work both in sequential and in parallel), but
> I am wondering why MATMPIAIJ would even exist if it supports only one
> third-party preconditioner?

1) MATAIJ is not a concrete type; it just creates MATSEQAIJ in serial and
MATMPIAIJ in parallel.

2) MATMPIAIJ supports many parallel direct solvers (see the end of
https://petsc.org/main/docs/manual/ksp/), including

  - MUMPS
  - SuperLU_dist
  - Hypre (Euclid)
  - CPardiso

There are also parallel AMG solvers, parallel DD solvers, and Krylov
solvers. The complaint you got said that a serial LU was being used with a
parallel matrix type, so using AIJ is the right solution.

  Thanks,

     Matt

> Cheers,
>
> Bojan Niceno

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
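For reference, a minimal sketch of the setup Matt describes: create the
matrix as the generic MATAIJ type and defer the solver choice to the
command line. The PETSc calls and option names below are the standard
ones, but the matrix size and the elided assembly are placeholders, not
anything from Bojan's solver.

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    Mat            A;
    KSP            ksp;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

    /* MATAIJ is the generic type: it becomes MATSEQAIJ on one rank and
       MATMPIAIJ on several, so the same code runs both ways. */
    ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100); CHKERRQ(ierr); /* 100x100 is a placeholder */
    ierr = MatSetType(A, MATAIJ); CHKERRQ(ierr);
    ierr = MatSetFromOptions(A); CHKERRQ(ierr);
    ierr = MatSetUp(A); CHKERRQ(ierr);
    /* ... insert entries with MatSetValues() here ... */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

    /* Leave the Krylov method and preconditioner to runtime options. */
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
    /* ... call KSPSolve(ksp, b, x) once b and x are assembled ... */

    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
    ierr = MatDestroy(&A); CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

With this, a parallel run can use, e.g., block Jacobi with ILU on each
block (the usual parallel replacement for a serial ILU):

  mpirun -np 4 ./solver -ksp_type bicg -pc_type bjacobi -sub_pc_type ilu

or a parallel direct solve through one of the packages Matt lists:

  mpirun -np 4 ./solver -pc_type lu -pc_factor_mat_solver_type mumps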