On Mon, Aug 5, 2013 at 7:20 AM, <wadud.m...@awe.co.uk> wrote:

> Hello Barry,
>
> Thanks for your response. Do you know how to get the entire error message?
That has to do with how MPI handles output in your environment. You should
ask your system administrator. Also consider running in the debugger.

   Matt

> Regards,
> Wadud.
>
> -----Original Message-----
> From: Barry Smith [mailto:bsm...@mcs.anl.gov]
> Sent: 01 August 2013 19:02
> To: Miah Wadud AWE
> Cc: petsc-users@mcs.anl.gov
> Subject: EXTERNAL: Re: [petsc-users] Matrix assembly error in PETSc
>
>    Please always send the ENTIRE error message; it makes it much easier
> for us to deduce what is going on.
>
>    Error code 63 is PETSC_ERR_ARG_OUTOFRANGE, which is presumably generated
> in MatSetValues_MPIAIJ() and means a row or column index is out of range.
> But since this is called from within MatAssemblyEnd_MPIAIJ(), it should
> never be out of range. The most likely cause is corruption of the values
> passed between processes with MPI. It is possible the error is due to bugs
> in the MPI implementation or to memory corruption elsewhere. I would first
> recommend running the code with valgrind (an enormously powerful tool) to
> rule out memory corruption:
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
>    Let us know what happens,
>
>    Barry
>
>    MPI 2.0 vs MPI 3.0 is likely not the issue.
>
> On Aug 1, 2013, at 10:08 AM, wadud.m...@awe.co.uk wrote:
>
> > Hello,
> >
> > I am running an application code which works with 4, 8 and 18 processes
> > but crashes with 16 processes. I have used MPICH2 and MVAPICH2 (both of
> > which adhere to the MPI 3.0 standard) and both cause the same problem.
> > I get the following error message:
> >
> > [12]PETSC ERROR: MatSetValues_MPIAIJ() line 564 in src/mat/impls/aij/mpi/mpiaij.c
> > [12]PETSC ERROR: MatAssemblyEnd_MPIAIJ() line 680 in src/mat/impls/aij/mpi/mpiaij.c
> > [12]PETSC ERROR: MatAssemblyEnd() line 4879 in src/mat/interface/matrix.c
> >
> > [12] --> Error in "MatAssemblyEnd()".
> > [12] --> Code: 63
> >
> > However, I do not get this error with the Intel MPI library (which
> > adheres to the MPI 2.0 standard). Any help will be greatly appreciated.
> >
> > Regards,
> >
> > --------------------------
> > Wadud Miah
> > HPC, Design and Theoretical Physics
> > Direct: 0118 98 56220
> > AWE, Aldermaston, Reading, RG7 4PR
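Since PETSC_ERR_ARG_OUTOFRANGE from MatSetValues_MPIAIJ() means a row or
column index outside the matrix's global size, one cheap check while hunting
this is to validate indices against MatGetSize() and MatGetOwnershipRange()
wherever the application calls MatSetValues(). The sketch below is only
illustrative and not taken from the application in question: the wrapper
name, the single-entry interface, and the INSERT_VALUES mode are placeholders.

#include <petscmat.h>

/* Minimal sketch: validate one (row,col) pair before handing it to
   MatSetValues(), and log off-process insertions, which are exactly the
   entries exchanged over MPI during MatAssemblyEnd(). */
static PetscErrorCode CheckedSetValue(Mat A, PetscInt row, PetscInt col, PetscScalar v)
{
  PetscInt       M, N, rstart, rend;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatGetSize(A, &M, &N);CHKERRQ(ierr);                   /* global dimensions  */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr); /* locally owned rows */

  if (row < 0 || row >= M || col < 0 || col >= N) {
    SETERRQ4(PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE,
             "Index (%D,%D) outside global size %D x %D", row, col, M, N);
  }
  if (row < rstart || row >= rend) {
    /* Legal, but these values travel between processes at assembly time,
       so they are worth logging while the 16-process failure is debugged. */
    ierr = PetscPrintf(PETSC_COMM_SELF, "off-process insertion: row %D, owned range [%D,%D)\n",
                       row, rstart, rend);CHKERRQ(ierr);
  }
  ierr = MatSetValues(A, 1, &row, 1, &col, &v, INSERT_VALUES);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

For the valgrind run, the FAQ link above gives the recommended invocation
(roughly: mpiexec -n 16 valgrind --tool=memcheck -q --log-file=valgrind.%p.log
./app), and for the debugger route the PETSc options -start_in_debugger and
-on_error_attach_debugger are the usual starting points.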
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener