Dear PETSc users,

I would like to ask a question about using an external library with
MPI+OpenMP from within PETSc. For example, within PETSc I want to use
MUMPS in hybrid MPI+OpenMP mode. Concretely, on a node with 12 cores and
24GB of memory (which would normally run 12 flat-MPI processes), MUMPS
should use 4 MPI processes, each with 6GB of memory and 3 OpenMP threads.
What needs to be done at PETSc configure/installation time to enable
this? And after using MUMPS this way, is there a way to switch back from
MPI+OpenMP to flat MPI for the rest of the computation? A rough sketch of
what I have in mind follows below. Thank you!
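
For reference, here is a minimal sketch of the configure and run lines I
imagine, assuming the --with-openmp/--download-hwloc configure options
and the -mat_mumps_use_omp_threads runtime option are the right knobs
(the executable name my_app is just a placeholder); please correct me if
any of this is wrong:

    # Configure PETSc with OpenMP and MUMPS (plus its usual dependencies);
    # my understanding is that hwloc is needed for the thread binding:
    ./configure --with-openmp --download-hwloc \
                --download-mumps --download-scalapack \
                --download-metis --download-parmetis

    # Launch flat MPI (12 ranks on the node) and ask the MUMPS interface
    # to merge every 3 ranks into one MUMPS MPI process with 3 OpenMP
    # threads, i.e. 4 MUMPS processes x 3 threads using all 24GB:
    export OMP_NUM_THREADS=3
    mpiexec -n 12 ./my_app -pc_type lu \
            -pc_factor_mat_solver_type mumps \
            -mat_mumps_use_omp_threads 3

If I understand correctly, this would also answer my second question,
since everything outside the MUMPS factorization stays flat MPI, but I
would appreciate confirmation.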

Regards,
Evan
