Question #117122 on DOLFIN changed: https://answers.launchpad.net/dolfin/+question/117122
Frantisek Fridrich posted a new comment:

Hello Garth. Thank you for the response. I ran the following demos; they work OK:

  elasticity, cpp
  stokes, stabilized, cpp
  stokes, TaylorHood, cpp
  lift-drag, cpp
  poisson, cpp
  hyperelasticity, cpp

The following demos raise an error:

  elastodynamics, python
  elasticity, python
  stokes, stabilized, python
  stokes, TaylorHood, python
  lift-drag, python
  poisson, python
  hyperelasticity, python

Calling FFC just-in-time (JIT) compiler, this may take some time.
Calling FFC just-in-time (JIT) compiler, this may take some time.
Lift: -14.742218
Drag: 57.550887
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun 4 15:34:52 CDT 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Unknown Name on a linux-gnu named juniper by fridrich Sat Jul 10 16:06:21 2010
[0]PETSC ERROR: Libraries linked from /home/rose/OpenMPI/openmpi-1.4.2/utils/petsc-3.1-p3/lib
[0]PETSC ERROR: Configure run at Thu Jul 8 19:49:50 2010
[0]PETSC ERROR: Configure options --prefix=/home/rose/OpenMPI/openmpi-1.4.2/utils/petsc-3.1-p3 --with-external-packages-dir=/home/rose/OpenMPI/openmpi-1.4.2/utilsRepo/petsc-3.1-p3/petscUtilsRepo --PETSC_ARCH=linux-gnu --PETSC_DIR=/home/rose/OpenMPI/openmpi-1.4.2/utilsRepo/petsc-3.1-p3/petsc-3.1-p3 --with-clanguage=c++ --with-c-support=yes --with-shared=yes --with-large-file-io=yes --CFLAGS= -Wall -march=opteron -m64 -O2 -fno-reorder-blocks -fno-reorder-functions -pipe -fPIC --CXXFLAGS= -Wall -march=opteron -m64 -O2 -fno-reorder-blocks -fno-reorder-functions -pipe -fPIC --FFLAGS= -Wall -march=opteron -m64 -O2 -fno-reorder-blocks -fno-reorder-functions -pipe -fPIC --with-ar=ar --AR_FLAGS=cr --with-ranlib=ranlib --COPTFLAGS= -O2 -fno-reorder-blocks -fno-reorder-functions --CXXOPTFLAGS= -O2 -fno-reorder-blocks -fno-reorder-functions --FOPTFLAGS= -O2 -fno-reorder-blocks -fno-reorder-functions --with-mpi-dir=/home/rose/OpenMPI/openmpi-1.4.2/install --with-mpi-shared=yes --with-spooles=yes --download-spooles=yes --with-blas-lapack-dir=/home/rose/OpenMPI/openmpi-1.4.2/utils/atlas-3.9.24/lib --with-blacs=yes --download-blacs=yes --with-parmetis=yes --with-parmetis-dir=/home/rose/OpenMPI/openmpi-1.4.2/utils/ParMetis-3.1.1 --with-scalapack=yes --download-scalapack=yes --with-mumps=yes --download-mumps=yes --with-hypre=yes --with-hypre-dir=/home/rose/OpenMPI/openmpi-1.4.2/utils/hypre-2.6.0b --with-umfpack=yes --download-umfpack=yes
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

The demo elastodynamics, cpp does not work in my installation either.

Best regards,
Frantisek

You received this question notification because you are a member of
DOLFIN Team, which is an answer contact for DOLFIN.

_______________________________________________
Mailing list: https://launchpad.net/~dolfin
Post to     : dolfin@lists.launchpad.net
Unsubscribe : https://launchpad.net/~dolfin
More help   : https://help.launchpad.net/ListHelp