Fabio,

Did you install PLAPACK with PETSc?

Hong
On Thu, 17 Sep 2009, Matthew Knepley wrote:

> Give us the exact command line you use for ex123 and the error output.
> Send to petsc-maint.
>
>   Matt
>
> On Thu, Sep 17, 2009 at 12:37 PM, Fabio Leite Soares <fls2 at cin.ufpe.br> wrote:
>
>> Hi everyone, I have the same problem and I don't know how to fix it.
>>
>> I need to multiply two MPI dense matrices using the BLAS3 routines. I have
>> tried the MatMatMult_MPIDense_MPIDense() function, but the console shows
>> this message:
>>
>> [0]PETSC ERROR: ------------------------------------------------------------------------
>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
>> [0]PETSC ERROR: or try http://valgrind.org on linux or man libgmalloc on Apple to find memory corruption errors
>> [0]PETSC ERROR: likely location of problem given in stack below
>> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>> [0]PETSC ERROR: INSTEAD the line number of the start of the function is given.
>> [0]PETSC ERROR: [0] MatMPIDenseCopyToPlapack line 1028 src/mat/impls/dense/mpi/mpidense.c
>> [0]PETSC ERROR: [0] MatMatMultNumeric_MPIDense_MPIDense line 1078 src/mat/impls/dense/mpi/mpidense.c
>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>> [0]PETSC ERROR: Signal received!
>> [0]PETSC ERROR: ------------------------------------------------------------------------
>> [0]PETSC ERROR: Petsc Release Version 3.0.0, Patch 8, Fri Aug 21 14:02:12 CDT 2009
>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [0]PETSC ERROR: See docs/index.html for manual pages.
>> [0]PETSC ERROR: ------------------------------------------------------------------------
>> [0]PETSC ERROR: ./mult on a linux-gnu named hpcin08 by hpcin Thu Sep 17 14:28:28 2009
>> [0]PETSC ERROR: Libraries linked from /home/hpcin/soft/petsc-3.0.0-p8/linux-gnu-c-debug/lib
>> [0]PETSC ERROR: Configure run at Wed Sep 16 17:06:08 2009
>> [0]PETSC ERROR: Configure options --download-f-blas-lapack=1 --download-plapack --with-mpi-dir=/usr/local/bin/mpich2-1.1.1p1 --with-scalar-type=real --with-precision=double --with-shared=0
>> [0]PETSC ERROR: ------------------------------------------------------------------------
>> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>>
>> [1]PETSC ERROR: ------------------------------------------------------------------------
>> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
>> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
>> [1]PETSC ERROR: or try http://valgrind.org on linux or man libgmalloc on Apple to find memory corruption errors
>> [1]PETSC ERROR: likely location of problem given in stack below
>> [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
>> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>> [1]PETSC ERROR: INSTEAD the line number of the start of the function is given.
>> [1]PETSC ERROR: [1] MatMPIDenseCopyToPlapack line 1028 src/mat/impls/dense/mpi/mpidense.c
>> [1]PETSC ERROR: [1] MatMatMultNumeric_MPIDense_MPIDense line 1078 src/mat/impls/dense/mpi/mpidense.c
>> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
>> [1]PETSC ERROR: Signal received!
>> [1]PETSC ERROR: ------------------------------------------------------------------------
>> [1]PETSC ERROR: Petsc Release Version 3.0.0, Patch 8, Fri Aug 21 14:02:12 CDT 2009
>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [1]PETSC ERROR: See docs/index.html for manual pages.
>> [1]PETSC ERROR: ------------------------------------------------------------------------
>> [1]PETSC ERROR: ./mult on a linux-gnu named hpcin-desktop by hpcin Thu Sep 17 14:28:27 2009
>> [1]PETSC ERROR: Libraries linked from /home/hpcin/soft/petsc-3.0.0-p8/linux-gnu-c-debug/lib
>> [1]PETSC ERROR: Configure run at Tue Sep 15 15:57:39 2009
>> [1]PETSC ERROR: Configure options --download-plapack=1 --download-f-blas-lapack=1 --with-mpi-dir=/usr/local/bin/mpich2-1.1.1p1 --with-scalar-type=real --with-precision=double --with-shared=0
>> [1]PETSC ERROR: ------------------------------------------------------------------------
>> [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
>> rank 1 in job 1 hpcin08_34697 caused collective abort of all ranks
>> exit status of rank 1: return code 59
>> rank 0 in job 1 hpcin08_34697 caused collective abort of all ranks
>> exit status of rank 0: return code 59
>>
>> I also tried to execute the ex123.c example and did not succeed either.
>>
>> Regards
>>
>> --
>> Fábio Leite Soares
>> Undergraduate Student of Computing Engineering
>> Centro de Informática - UFPE - BRAZIL
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
>  -- Norbert Wiener
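
For reference, MatMatMult_MPIDense_MPIDense() is an internal kernel; user code
normally calls the generic MatMatMult(), which dispatches to the PLAPACK-backed
dense routine when both operands are MATMPIDENSE. Below is a minimal sketch of
such a multiply, untested, written against the petsc-3.0.0 API seen in the log
above (so MatDestroy() takes a Mat, not a Mat*); the 8x8 size and the
1/(i+j+1) test entries are placeholders, not taken from the thread, and it
assumes a PETSc built with --download-plapack as in the configure lines above.

/* mult.c - sketch: C = A*B with two parallel dense matrices */
#include "petscmat.h"

int main(int argc, char **argv)
{
  Mat            A, B, C;
  PetscInt       i, j, rstart, rend, N = 8;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);

  /* Create A as a distributed dense matrix; PETSC_NULL lets PETSc
     allocate the local storage itself */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIDENSE);CHKERRQ(ierr);
  ierr = MatMPIDenseSetPreallocation(A, PETSC_NULL);CHKERRQ(ierr);

  /* Each process fills only the rows it owns */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    for (j = 0; j < N; j++) {
      PetscScalar v = 1.0/(i + j + 1);   /* arbitrary test entries */
      ierr = MatSetValues(A, 1, &i, 1, &j, &v, INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* B is just a copy of A; C = A*B is created by MatMatMult() */
  ierr = MatDuplicate(A, MAT_COPY_VALUES, &B);CHKERRQ(ierr);
  ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);
  ierr = MatView(C, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = MatDestroy(A);CHKERRQ(ierr);
  ierr = MatDestroy(B);CHKERRQ(ierr);
  ierr = MatDestroy(C);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

Built with the usual PETSc makefile rules and run with, e.g., mpiexec -n 2
./mult, a crash inside MatMPIDenseCopyToPlapack() from code this minimal would
point at the PLAPACK build rather than the calling code, which is what Hong's
question is probing.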
