I already used Valgrind to check for memory errors, and as far as I can tell the program was working fine. At this stage my concern is mainly the convergence of the KSP solver. I can later try compiling with petsc-dev.
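In case it clarifies what I am after, below is roughly the true-residual monitor I had considered writing myself before finding -ksp_monitor_true_residual. It is only a sketch: MyTrueResidualMonitor is a placeholder name, there is no tolerance logic, and it simply prints the unpreconditioned norm next to the norm KSP reports. (The programmatic form of the right-preconditioning options from the quoted thread is sketched in the P.S. at the end of this message.)

#include <petscksp.h>

/* Placeholder monitor: recompute the true residual b - A x_n at each
   iteration and print it next to the (preconditioned) norm that KSP
   passes in.  Register it with
     KSPMonitorSet(ksp, MyTrueResidualMonitor, PETSC_NULL, PETSC_NULL);
   before KSPSolve(); it duplicates what -ksp_monitor_true_residual
   already does, so it is only illustrative. */
static PetscErrorCode MyTrueResidualMonitor(KSP ksp, PetscInt it, PetscReal rnorm, void *ctx)
{
  Vec            b, work, resid;
  PetscReal      truenorm;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPGetRhs(ksp, &b);CHKERRQ(ierr);
  ierr = VecDuplicate(b, &work);CHKERRQ(ierr);
  /* KSPBuildResidual() assembles the unpreconditioned residual for the
     current iterate into the supplied work vector. */
  ierr = KSPBuildResidual(ksp, PETSC_NULL, work, &resid);CHKERRQ(ierr);
  ierr = VecNorm(resid, NORM_2, &truenorm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "%3d KSP preconditioned resid norm %g, true resid norm %g\n",
                     (int)it, (double)rnorm, (double)truenorm);CHKERRQ(ierr);
  ierr = VecDestroy(work);CHKERRQ(ierr); /* VecDestroy(&work) with petsc-dev */
  PetscFunctionReturn(0);
}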
Thank you for your time,
Ata

On Tue, Feb 1, 2011 at 3:34 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   Suggest running with valgrind http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind to see if there is some memory corruption. If that doesn't help, suggest a build with MPICH to see if the same or a different problem happens.
>
>   Getting a partial error message like this is not normal and rarely seen.
>
>   Barry
>
> On Feb 1, 2011, at 3:32 PM, Matthew Knepley wrote:
>
> > On Tue, Feb 1, 2011 at 3:27 PM, Ataollah Mesgarnejad <amesga1 at tigers.lsu.edu> wrote:
> > I compile it with Open MPI and it runs on 5 cores. And that's all the error message from PETSc. The complete output looks like this:
> >
> > Something is wrong with your output gathering. It would indicate the error, or a signal received.
> >
> >    Matt
> >
> > [3]PETSC ERROR: MatHYPRE_IJMatrixCreate() line 76 in src/dm/da/utils/mhyp.c
> > [3]PETSC ERROR: PCSetUp_HYPRE() line 112 in src/ksp/pc/impls/hypre/hypre.c
> > [3]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
> > [3]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
> > [3]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
> > [3]PETSC ERROR: UStep() line 511 in PFMAT-FD.cpp
> > [4]PETSC ERROR: MatHYPRE_IJMatrixCreate() line 76 in src/dm/da/utils/mhyp.c
> > [4]PETSC ERROR: PCSetUp_HYPRE() line 112 in src/ksp/pc/impls/hypre/hypre.c
> > [4]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
> > [4]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
> > [3]PETSC ERROR: main() line 95 in PFMAT-main.cpp
> > [4]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
> > [4]PETSC ERROR: UStep() line 511 in PFMAT-FD.cpp
> > [4]PETSC ERROR: main() line 95 in PFMAT-main.cpp
> > --------------------------------------------------------------------------
> > MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
> > with errorcode 1.
> >
> > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > You may or may not see output from other processes, depending on
> > exactly when Open MPI kills them.
> > --------------------------------------------------------------------------
> > --------------------------------------------------------------------------
> > mpirun has exited due to process rank 3 with PID 3845 on
> > node me-1203svr3.lsu.edu exiting without calling "finalize". This may
> > have caused other processes in the application to be
> > terminated by signals sent by mpirun (as reported here).
> >
> > Best,
> > Ata
> >
> > On Tue, Feb 1, 2011 at 2:35 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >   Right preconditioning shouldn't matter to the preconditioner at all. What is the complete error message? Does it run on one process?
> >
> >   Barry
> >
> > On Feb 1, 2011, at 2:32 PM, Ataollah Mesgarnejad wrote:
> >
> > > Barry,
> > >
> > > I did as you said and now I receive these errors once I try to run the program:
> > >
> > > [3]PETSC ERROR: MatHYPRE_IJMatrixCreate() line 76 in src/dm/da/utils/mhyp.c
> > > [3]PETSC ERROR: PCSetUp_HYPRE() line 112 in src/ksp/pc/impls/hypre/hypre.c
> > > [3]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
> > > [3]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
> > > [3]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
> > > [3]PETSC ERROR: UStep() line 511 in PFMAT-FD.cpp
> > > [4]PETSC ERROR: MatHYPRE_IJMatrixCreate() line 76 in src/dm/da/utils/mhyp.c
> > > [4]PETSC ERROR: PCSetUp_HYPRE() line 112 in src/ksp/pc/impls/hypre/hypre.c
> > > [4]PETSC ERROR: PCSetUp() line 795 in src/ksp/pc/interface/precon.c
> > > [4]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
> > > [3]PETSC ERROR: main() line 95 in PFMAT-main.cpp
> > > [4]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
> > > [4]PETSC ERROR: UStep() line 511 in PFMAT-FD.cpp
> > > [4]PETSC ERROR: main() line 95 in PFMAT-main.cpp
> > >
> > > Does right preconditioning work with HYPRE?
> > > Best,
> > > Ata
> > >
> > > On Tue, Feb 1, 2011 at 2:21 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > >
> > >   The simplest thing is to simply run with -ksp_monitor_true_residual and see how the convergence is going in the true residual norm also.
> > >
> > >   You can also switch to right preconditioning with gmres and then the residual used by gmres is the true residual norm. Use -ksp_preconditioner_side right -ksp_norm_type unpreconditioned
> > >
> > >   Barry
> > >
> > > On Feb 1, 2011, at 2:14 PM, Ataollah Mesgarnejad wrote:
> > >
> > > > Dear all,
> > > >
> > > > I'm using KSP GMRES with BoomerAMG preconditioning, and I suspect that even though it converges in the preconditioned norm it doesn't converge in the true norm. But as I understand from KSPSetNormType, GMRES does not support the true residual norm? Is that correct, and if it is, is there any other way to monitor the true norm? Do I need to introduce my own convergence test?
> > > >
> > > > Best,
> > > > A. Mesgarnejad
> > >
> > > --
> > > A. Mesgarnejad
> > > PhD Student, Research Assistant
> > > Mechanical Engineering Department
> > > Louisiana State University
> > > 2203 Patrick F. Taylor Hall
> > > Baton Rouge, La 70803
> >
> > --
> > A. Mesgarnejad
> > PhD Student, Research Assistant
> > Mechanical Engineering Department
> > Louisiana State University
> > 2203 Patrick F. Taylor Hall
> > Baton Rouge, La 70803
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener

--
A. Mesgarnejad
PhD Student, Research Assistant
Mechanical Engineering Department
Louisiana State University
2203 Patrick F. Taylor Hall
Baton Rouge, La 70803
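P.S. For my own reference, the programmatic analogue of the right-preconditioning options Barry suggested would look roughly like the sketch below. I have not verified it against this build; the function and enum names here (KSPSetPCSide, KSP_NORM_UNPRECONDITIONED) follow the petsc-dev manual pages and may be spelled differently in the release I am running, so treat it as an assumption to check.

#include <petscksp.h>

/* Sketch: ask GMRES to apply the preconditioner on the right and to base
   its convergence test on the unpreconditioned (true) residual norm.
   Names follow current manual pages; older releases may differ. */
PetscErrorCode ConfigureRightPC(KSP ksp)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
  ierr = KSPSetPCSide(ksp, PC_RIGHT);CHKERRQ(ierr);                    /* right preconditioning   */
  ierr = KSPSetNormType(ksp, KSP_NORM_UNPRECONDITIONED);CHKERRQ(ierr); /* converge on ||b - A x|| */
  /* Calling KSPSetFromOptions() afterwards lets command-line options
     still override these settings. */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}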
