http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
On Jul 31, 2013, at 10:56 AM, Jed Brown <jedbr...@mcs.anl.gov> wrote:

> wadud.m...@awe.co.uk writes:
>
>> Hello,
>>
>> I have built PETSc 3.4.2 with MVAPICH2 1.9b and my application code
>> crashes when I spawn it with 16 processes; it works with 4 and 8
>> processes. This occurs at the MatAssemblyEnd subroutine when the
>> non-local values are broadcast to other processes. Has anyone else had
>> issues with MVAPICH2?
>
> Can you get a stack trace? Does it work correctly with other MPI
> implementations? Does a basic PETSc example show the same problem?
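
If it helps to isolate the problem, below is a minimal sketch (not your code, just an illustration) of the assembly pattern that stresses the off-process communication in MatAssemblyBegin/MatAssemblyEnd: each rank inserts one value into a row owned by another rank, so those values are stashed and exchanged at assembly time. The matrix size and the choice of off-process row here are arbitrary. Running something like this on 16 ranks, both plainly and under valgrind as described in the FAQ link above, should tell you whether the failure is in PETSc/MVAPICH2 or in your application.

    /* Suggested run (see the valgrind FAQ entry linked above), e.g.:
       mpiexec -n 16 valgrind --tool=memcheck -q ./assembly_test          */
    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat         A;
      PetscInt    n = 1000, rstart, rend, i;
      PetscScalar v = 1.0;

      PetscInitialize(&argc, &argv, NULL, NULL);

      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
      MatSetFromOptions(A);
      MatSetUp(A);

      MatGetOwnershipRange(A, &rstart, &rend);
      for (i = rstart; i < rend; i++) {
        /* Local (diagonal) entry */
        MatSetValue(A, i, i, v, ADD_VALUES);
        /* Entry in a row owned by another process: stashed now,
           communicated during assembly */
        MatSetValue(A, (i + n/2) % n, i, v, ADD_VALUES);
      }

      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);  /* off-process values exchanged here */

      MatDestroy(&A);
      PetscFinalize();
      return 0;
    }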