As Jed points out, most of the time (~98%) is spent in your code rather than in 
the PETSc solver, so you need to profile your own source code and figure out 
where that time goes. You can do this with PETSc's user-code profiling tools, 
PetscLogEvent and PetscLogStage. Read Sections 11.2 and 11.3 of the User's 
Manual for more information on PetscLogEvent and PetscLogStage, see the short 
sketch after the links below, and also look at examples such as 
http://www.mcs.anl.gov/petsc/petsc-current/src/vec/vec/examples/tutorials/ex5.c.html
 
http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex2.c.html
 
http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex46.c.html
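
As a minimal sketch (not from this thread; the stage name "Assembly", the event
name "UserMatSetup", and the timed region are placeholders for your own assembly
and solve code), the logging calls look roughly like this:

/* Minimal sketch of user-code profiling with PetscLogStage/PetscLogEvent */
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscClassId   classid;
  PetscLogStage  stage;
  PetscLogEvent  USER_EVENT;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Register a class id, a stage, and an event once, near startup */
  ierr = PetscClassIdRegister("User profiling", &classid);CHKERRQ(ierr);
  ierr = PetscLogStageRegister("Assembly", &stage);CHKERRQ(ierr);
  ierr = PetscLogEventRegister("UserMatSetup", classid, &USER_EVENT);CHKERRQ(ierr);

  /* Wrap the code you want timed in the stage/event begin-end pairs */
  ierr = PetscLogStagePush(stage);CHKERRQ(ierr);
  ierr = PetscLogEventBegin(USER_EVENT, 0, 0, 0, 0);CHKERRQ(ierr);
  /* ... your matrix assembly / right-hand-side computation goes here ... */
  ierr = PetscLogEventEnd(USER_EVENT, 0, 0, 0, 0);CHKERRQ(ierr);
  ierr = PetscLogStagePop();CHKERRQ(ierr);

  ierr = PetscFinalize();
  return 0;
}

Run with -log_summary and the time spent in the named stage and event is
reported separately, which tells you which part of your own code dominates the
880 seconds.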
Shri

----- Original Message -----
> On Fri, Mar 16, 2012 at 14:52, < Nan.Jia at dartmouth.edu > wrote:
> > Dear PETSc Group,
> > I have been tuning the efficiency of my PETSc code for a while, but
> > have made very little progress. Can anyone help me analyze the log?
> > Any suggestions will be appreciated.
> > My problem is time dependent. At every time step, two sparse linear
> > systems of about 6000 by 6000, which come from a Poisson equation,
> > need to be solved. I use both the sequential and parallel AIJ formats
> > to store the matrices, but the performance is not very good in either
> > case.
> 1. You need to heed the huge warning
> ##########################################################
> #                                                        #
> #                       WARNING!!!                       #
> #                                                        #
> #   This code was compiled with a debugging option,      #
> #   To get timing results run config/configure.py        #
> #   using --with-debugging=no, the performance will      #
> #   be generally two or three times faster.              #
> #                                                        #
> ##########################################################
> 2. You only spend 18 of 880 seconds in the solver. What do you want?
> 3. The problem is too small to get significant parallel speedup.
> http://www.mcs.anl.gov/petsc/documentation/faq.html#slowerparallel
