Dear colleagues, Has anyone here solved large sparse linear systems using PETSc?
We have found NO performance improvement when using more and more MPI processes (1, 2, 3) and OpenMP threads (from 1 to 72). Has anyone faced this problem? Does anyone know possible reasons for such behaviour?
We use an AMG preconditioner and the GMRES solver from the KSP package, as our matrix is large (from 100,000 to 1e+6 rows and columns), sparse, non-symmetric, and contains both positive and negative values. But the performance problems also appear when using CG solvers with symmetric matrices.
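For concreteness, our setup is essentially the following minimal sketch (assembly of the matrix A and the vectors b, x is omitted, and SolveSystem is just an illustrative name, not our actual code):

    #include <petscksp.h>

    /* Simplified sketch of our solver setup: GMRES + algebraic
       multigrid (GAMG). The matrix A and vectors b, x are assumed
       to be already assembled and distributed over MPI. */
    PetscErrorCode SolveSystem(Mat A, Vec b, Vec x)
    {
      KSP            ksp;
      PC             pc;
      PetscErrorCode ierr;

      ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
      ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);  /* non-symmetric matrix */
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);      /* AMG preconditioner */
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);     /* pick up command-line options */
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      return 0;
    }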
Could anyone help us set appropriate options for the preconditioner and solver? At the moment we use the default parameters; they may not be the best, but we do not know a good combination. Or could you suggest other preconditioner+solver pairs for such problems?
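A typical run looks like this (the binary name is a placeholder; the rest are standard PETSc runtime options, with -log_view producing the performance statistics we can share):

    mpirun -np 2 ./solver -ksp_type gmres -pc_type gamg -ksp_monitor -log_view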
I can provide more information: the matrices that we solve, the C++ code we use to run the PETSc solve, and any statistics from our runs.
Thank you in advance!

Best regards,
Lidiia Varshavchik,
Ioffe Institute, St. Petersburg, Russia
