On Mon, 2011-05-09 at 11:35 -0500, Barry Smith wrote:
> The code is very slow for one of two reasons
>
> 1) the matrix preallocation is not correct and it is spending a great deal of
> time in the MatSetValues() calls. To check this run the code with -info and
> grep the output for malloc. My guess is that you are not preallocating for
> the diagonal and hence the preallocation is not enough. See
> http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#efficient-assembly
>
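A minimal sketch of the exact preallocation Barry describes, for a sequential
AIJ matrix. The size n and the uniform five-entries-per-row pattern are made
up purely for illustration; the real per-row counts must come from the
problem's sparsity pattern. The key point is that each count includes the
diagonal entry:

    PetscErrorCode ierr;
    PetscInt       n = 1000, i, *nnz;    /* n is a placeholder size */
    Mat            A;

    ierr = PetscMalloc(n*sizeof(PetscInt), &nnz);CHKERRQ(ierr);
    for (i = 0; i < n; i++) nnz[i] = 5;  /* exact nonzeros in row i,
                                            counting the diagonal */
    /* With nnz[] exact, every MatSetValues() call lands in
       preallocated space, so "-info | grep malloc" reports none */
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 0, nnz, &A);CHKERRQ(ierr);
    ierr = PetscFree(nnz);CHKERRQ(ierr);

With correct preallocation, assembly time is typically negligible next to
the solve itself.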
I don't think that this is the source of the problem. Filling can be more
efficient (maybe?), but as I observed, the most time-consuming part is the
call to KSPSolve.

> 2) the convergence of the solver is very slow. Run with -ksp_monitor and see
> how many iterations it is taking to converge. Is it thousands? If so, you
> will need to select a better preconditioner or use a direct solver -pc_type
> lu.
>

A direct solver? Never! :) I am escaping from the time complexity of direct
solvers; otherwise I already have a LAPACK-based code that works well.

> Barry
>
>
> On May 9, 2011, at 11:02 AM, Danesh Daroui wrote:
>
> >
> > Hi again,
> >
> > The issue is solved and it was due to a bug in my code. But now the
> > problem is that PETSc is extremely slow when I try to solve even a small
> > equation. I fill the matrix and the right-hand side and call the solver
> > as below:
> >
> > ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);
> > ierr = KSPSetOperators(ksp, Mp, Mp, DIFFERENT_NONZERO_PATTERN);
> > ierr = KSPSetTolerances(ksp, 1.e-2/Msize, 1.e-50, PETSC_DEFAULT,
> >                         PETSC_DEFAULT);
> > ierr = KSPSetFromOptions(ksp);
> > ierr = KSPSolve(ksp, bp, xp);
> >
> > Msize is the size of the right-hand-side vector. I use C++ and run
> > PETSc in sequential mode, so no MPI is used, but it still shouldn't
> > be that slow! Am I right, or am I missing something?
> >
> > Thanks,
> >
> > Danesh
> >
> >
> > On Sun, 2011-05-08 at 14:41 +0200, dan at ltu.se wrote:
> >> Hi,
> >>
> >> I checked my code and there is no zero on the diagonal. I also modified
> >> the code so that if there is any zero on the diagonal it will be stored
> >> anyway. After this change the problem still exists! I am not sure
> >> whether I have called the PETSc functions correctly or in the correct
> >> sequence. In any case, it returns the error:
> >>
> >> [0]PETSC ERROR: Object is in wrong state!
> >>
> >> That and the rest of the error message are exactly as I posted before.
> >> Any idea how to solve this problem?
> >>
> >> Thanks,
> >>
> >> D.
> >>
> >>
> >> Quoting Barry Smith <bsmith at mcs.anl.gov>:
> >>
> >>> Danesh,
> >>>
> >>> Some of the PETSc solvers (like the default ILU) require that you put
> >>> entries on all the diagonal locations of the matrix, even if they are
> >>> zero. So you should explicitly put a 0 on the diagonal locations that
> >>> have 0 and the issue will go away.
> >>>
> >>> Barry
> >>>
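A sketch of the explicit-zero fix from the last message, assuming the
matrix is called A and has n rows (both placeholders). Adding 0.0 to every
diagonal location leaves the values unchanged but forces those locations
into the nonzero structure, which factorization-based preconditioners such
as the default ILU require:

    PetscInt    i;
    PetscScalar zero = 0.0;

    /* If the matrix was filled with INSERT_VALUES, flush before
       switching modes: PETSc forbids mixing INSERT_VALUES and
       ADD_VALUES without an intervening flush assembly. */
    ierr = MatAssemblyBegin(A, MAT_FLUSH_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FLUSH_ASSEMBLY);CHKERRQ(ierr);
    for (i = 0; i < n; i++) {
        /* ADD_VALUES of 0.0 guarantees the slot exists without
           disturbing any value already stored there */
        ierr = MatSetValues(A, 1, &i, 1, &i, &zero, ADD_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);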

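Since the code above already calls KSPSetFromOptions(), the diagnostics
suggested in this thread need no recompilation; ./app stands in for the
actual executable:

    ./app -info | grep malloc        # any mallocs => preallocation is wrong
    ./app -ksp_monitor               # residual norm at every iteration
    ./app -ksp_view -log_summary     # solver configuration, where time goes
    ./app -pc_type ilu -pc_factor_levels 1   # a stronger ILU to try before
                                             # resorting to -pc_type lu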