We just discovered this problem ourselves with 32-bit compiles. You likely need to pass ./configure the flags --with-cc="gcc -malign-double" --with-cxx="g++ -malign-double". nvcc silently uses that option in its compiles, so we need to use it for all the compilers.
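For example, reusing the configure options shown in your Test.log (this is just a sketch of what I mean; keep whatever other options and paths you already use):

   ./configure --with-cc="gcc -malign-double" --with-cxx="g++ -malign-double" \
               --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 \
               --with-cuda=1 --with-debug=no --with-cusp=1 --with-thrust=1

Then rebuild the PETSc libraries and rerun the tests.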
Barry

On Dec 10, 2010, at 5:33 AM, Jakub Pola wrote:

> After adding --with-cusp=1 and --with-thrust=1 the library compiled
> successfully, but when I run the tests I get the following error.
> What could be the reason for that?
>
> I have a GTX 480 with driver version 260.
> Ubuntu 10.10 32-bit system with 2 GB RAM.
> Core Duo processor, 2.2 GHz.
>
> Test.log:
>
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> See http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecCUDACopyFromGPU line 188 src/vec/vec/impls/seq/seqcuda/veccuda.cu
> [0]PETSC ERROR: [0] VecGetArray line 226 src/vec/vec/impls/mpi//home/kuba/External/petsc-dev/include/private/vecimpl.h
> [0]PETSC ERROR: [0] VecCreateGhostWithArray line 581 src/vec/vec/impls/mpi/pbvec.c
> [0]PETSC ERROR: [0] VecCreateGhost line 661 src/vec/vec/impls/mpi/pbvec.c
> [0]PETSC ERROR: [0] MatFDColoringCreate_SeqAIJ line 21 src/mat/impls/aij/seq/fdaij.c
> [0]PETSC ERROR: [0] MatFDColoringCreate line 376 src/mat/matfd/fdmatrix.c
> [0]PETSC ERROR: [0] DMMGSetSNES line 562 src/snes/utils/damgsnes.c
> [0]PETSC ERROR: [0] DMMGSetSNESLocal_Private line 933 src/snes/utils/damgsnes.c
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: 488e1fcaa13db132861c12416293551e6e00b14e HG Date: Thu Dec 09 20:23:16 2010 +0100
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./ex19 on a arch-linu named desktop by kuba Fri Dec 10 08:36:02 2010
> [0]PETSC ERROR: Libraries linked from /home/kuba/External/petsc-dev/arch-linux-gnu-c-debug/lib
> [0]PETSC ERROR: Configure run at Fri Dec 10 08:05:40 2010
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --with-cuda=1 --with-debug=no --with-cusp=1 --with-thrust=1
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> See http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html
> [0]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [1]PETSC ERROR: likely location of problem given in stack below
> [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecCUDACopyFromGPU line 188 src/vec/vec/impls/seq/seqcuda/veccuda.cu
> [0]PETSC ERROR: [0] VecGetArray line 226 src/vec/vec/impls/mpi//home/kuba/External/petsc-dev/include/private/vecimpl.h
> [0]PETSC ERROR: [0] VecCreateGhostWithArray line 581 src/vec/vec/impls/mpi/pbvec.c
> [0]PETSC ERROR: [0] VecCreateGhost line 661 src/vec/vec/impls/mpi/pbvec.c
> [0]PETSC ERROR: [0] MatFDColoringCreate_MPIAIJ line 24 src/mat/impls/aij/mpi/fdmpiaij.c
> [0]PETSC ERROR: [0] MatFDColoringCreate line 376 src/mat/matfd/fdmatrix.c
> [0]PETSC ERROR: [0] DMMGSetSNES line 562 src/snes/utils/damgsnes.c
> [0]PETSC ERROR: [0] DMMGSetSNESLocal_Private line 933 src/snes/utils/damgsnes.c
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: 488e1fcaa13db132861c12416293551e6e00b14e HG Date: Thu Dec 09 20:23:16 2010 +0100
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./ex19 on a arch-linu named desktop by kuba Fri Dec 10 08:36:03 2010
> [0]PETSC ERROR: Libraries linked from /home/kuba/External/petsc-dev/arch-linux-gnu-c-debug/lib
> [0]PETSC ERROR: Configure run at Fri Dec 10 08:05:40 2010
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --with-cuda=1 --with-debug=no --with-cusp=1 --with-thrust=1
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> [1]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> Note: The EXACT line numbers in the stack are not available,
> [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> [1]PETSC ERROR:       is given.
> [1]PETSC ERROR: [1] VecCUDACopyFromGPU line 188 src/vec/vec/impls/seq/seqcuda/veccuda.cu
> [1]PETSC ERROR: [1] VecGetArray line 226 src/vec/vec/impls/mpi//home/kuba/External/petsc-dev/include/private/vecimpl.h
> [1]PETSC ERROR: [1] VecCreateGhostWithArray line 581 src/vec/vec/impls/mpi/pbvec.c
> [1]PETSC ERROR: [1] VecCreateGhost line 661 src/vec/vec/impls/mpi/pbvec.c
> [1]PETSC ERROR: [1] MatFDColoringCreate_MPIAIJ line 24 src/mat/impls/aij/mpi/fdmpiaij.c
> [1]PETSC ERROR: [1] MatFDColoringCreate line 376 src/mat/matfd/fdmatrix.c
> [1]PETSC ERROR: [1] DMMGSetSNES line 562 src/snes/utils/damgsnes.c
> [1]PETSC ERROR: [1] DMMGSetSNESLocal_Private line 933 src/snes/utils/damgsnes.c
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> [1]PETSC ERROR: Signal received!
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Development HG revision: 488e1fcaa13db132861c12416293551e6e00b14e HG Date: Thu Dec 09 20:23:16 2010 +0100
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: See docs/index.html for manual pages.
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: ./ex19 on a arch-linu named desktop by kuba Fri Dec 10 08:36:03 2010
> [1]PETSC ERROR: Libraries linked from /home/kuba/External/petsc-dev/arch-linux-gnu-c-debug/lib
> [1]PETSC ERROR: Configure run at Fri Dec 10 08:05:40 2010
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --with-cuda=1 --with-debug=no --with-cusp=1 --with-thrust=1
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
> [cli_1]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
> APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
> Error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
> See http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecCUDACopyFromGPU line 188 src/vec/vec/impls/seq/seqcuda/veccuda.cu
> [0]PETSC ERROR: [0] VecGetArray line 226 src/vec/vec/interface/ftn-custom//home/kuba/External/petsc-dev/include/private/vecimpl.h
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: 488e1fcaa13db132861c12416293551e6e00b14e HG Date: Thu Dec 09 20:23:16 2010 +0100
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./ex5f on a arch-linu named desktop by kuba Fri Dec 10 08:36:09 2010
> [0]PETSC ERROR: Libraries linked from /home/kuba/External/petsc-dev/arch-linux-gnu-c-debug/lib
> [0]PETSC ERROR: Configure run at Fri Dec 10 08:05:40 2010
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --with-cuda=1 --with-debug=no --with-cusp=1 --with-thrust=1
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> APPLICATION TERMINATED WITH THE EXIT STRING: Hangup (signal 1)
> Completed test examples
>
> On Fri, 2010-12-10 at 00:38 +0100, Jed Brown wrote:
>> On Fri, Dec 10, 2010 at 00:07, Jakub Pola <jakub.pola at gmail.com> wrote:
>> I downloaded Petsc 3.1.0 as well as thrust 1.4.0 and cusp 0.2.0
>>
>> As I said before, you need petsc-dev for this; CUDA support is not in 3.1.
>>
>> http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html#Obtaining
>
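If it still crashes after reconfiguring with -malign-double, the next step is what the error message itself suggests: run the failing example by hand under valgrind or with -start_in_debugger. An untested sketch, assuming you are using the MPICH that configure downloaded (so mpiexec lives under $PETSC_DIR/$PETSC_ARCH/bin):

   cd src/snes/examples/tutorials
   make ex19
   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 1 valgrind --track-origins=yes ./ex19
   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 1 ./ex19 -start_in_debugger

Either one should point at the exact spot in VecCUDACopyFromGPU where the bad memory access happens.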
