Try running the C++ version of the demo. This will indicate whether the issue 
is in the underlying C++ libraries or in the JIT layer.
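
For example, a bare-bones C++ program along these lines (a rough sketch, not
one of the shipped demos; compile details depend on your install) builds the
same mesh your backtrace crashed in, without going through SWIG or the JIT:

// Minimal C++ check: construct the mesh from the failing Python session
// directly against libdolfin, bypassing the Python bindings and JIT entirely.
// Hypothetical compile line, adjust include/lib paths to your build:
//   g++ mesh_test.cpp -o mesh_test -I<dolfin include> -L<dolfin lib> -ldolfin
#include <dolfin.h>

using namespace dolfin;

int main()
{
  // This goes through BoxMesh::build() and SubSystemsManager::init_mpi(),
  // the same frames that appear in the backtrace below.
  UnitCubeMesh mesh(24, 16, 16);
  info("Mesh built: %d cells", (int) mesh.num_cells());
  return 0;
}

If that also dies during MPI initialisation, the problem is below DOLFIN (the
MPI build or launch environment) rather than in the Python bindings.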

Garth

On 11 Apr 2014, at 16:28, Mike Welland <[email protected]> wrote:

> Logged in with a different terminal and managed to execute 
> demo_hyperelasticity by copying and pasting it line by line into python. Got as 
> far as the solve command and encountered a segmentation fault:
> 
> >>> # Solve variational problem
> ... solve(F == 0, u, bcs, J=J,
> ...       form_compiler_parameters=ffc_options)
> Solving nonlinear variational problem.
> Segmentation fault (core dumped)
> 
> Tried the exact same thing again and got the first error (the signal 7 crash 
> quoted below) again.... 
> 
> 
> On Thu, Apr 10, 2014 at 8:24 PM, Mike Welland <[email protected]> wrote:
> Hi, I built dolfin, imported it through python and tried to create a mesh and 
> got this:
> 
> Python 2.7.3 (default, Oct  1 2013, 02:48:58) 
> [GCC 4.1.2 20080704 (Red Hat 4.1.2-54)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> >>> from dolfin import *
> >>> mesh = UnitCubeMesh(24, 16, 16)
> 
> python:8946 terminated with signal 7 at PC=3bc020c380 SP=7fffa8e92028.  
> Backtrace:
> /lib64/libpthread.so.0(pthread_spin_lock+0x0)[0x3bc020c380]
> /software/mvapich2-gnu-psm-1.8.1/lib/libmpich.so.3(create_2level_comm+0xb62)[0x2b3e13c9f292]
> /software/mvapich2-gnu-psm-1.8.1/lib/libmpich.so.3(MPIR_Init_thread+0x3a8)[0x2b3e13cbcee8]
> /software/mvapich2-gnu-psm-1.8.1/lib/libmpich.so.3(MPI_Init_thread+0x67)[0x2b3e13cbcf97]
> /home/mwelland/programs/git/dolfin/build/dolfin/libdolfin.so.1.3(_ZN6dolfin17SubSystemsManager8init_mpiEiPPci+0x8d)[0x2b3e155eb62d]
> /home/mwelland/programs/git/dolfin/build/dolfin/libdolfin.so.1.3(_ZN6dolfin17SubSystemsManager8init_mpiEv+0x31)[0x2b3e155eb831]
> /home/mwelland/programs/git/dolfin/build/dolfin/libdolfin.so.1.3(_ZN6dolfin3MPI4sizeEi+0xc)[0x2b3e155eb16c]
> /home/mwelland/programs/git/dolfin/build/dolfin/libdolfin.so.1.3(_ZN6dolfin3MPI11is_receiverEi+0x8)[0x2b3e155eb1b8]
> /home/mwelland/programs/git/dolfin/build/dolfin/libdolfin.so.1.3(_ZN6dolfin7BoxMesh5buildEddddddmmm+0xae)[0x2b3e15672c5e]
> /home/mwelland/programs/git/dolfin/build/dolfin/libdolfin.so.1.3(_ZN6dolfin7BoxMeshC1Eddddddmmm+0x90)[0x2b3e15673af0]
> /home/mwelland/local/lib/python2.7/site-packages/dolfin/cpp/_mesh.so(+0x85b95)[0x2b3e27834b95]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x5d22)[0x2b3e03e2baf2]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x8c9)[0x2b3e03e2daf9]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(+0x73622)[0x2b3e03db3622]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyObject_Call+0x68)[0x2b3e03d85cc8]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(+0x5630f)[0x2b3e03d9630f]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyObject_Call+0x68)[0x2b3e03d85cc8]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(+0xa6cac)[0x2b3e03de6cac]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(+0xa1e48)[0x2b3e03de1e48]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyObject_Call+0x68)[0x2b3e03d85cc8]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x10b2)[0x2b3e03e26e82]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x8c9)[0x2b3e03e2daf9]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyEval_EvalCode+0x32)[0x2b3e03e2db72]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyRun_InteractiveOneFlags+0x194)[0x2b3e03e50e24]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyRun_InteractiveLoopFlags+0x4e)[0x2b3e03e5104e]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(PyRun_AnyFileExFlags+0x4c)[0x2b3e03e5171c]
> /soft/python/2.7.3/lib/libpython2.7.so.1.0(Py_Main+0xb8a)[0x2b3e03e63b7a]
> /lib64/libc.so.6(__libc_start_main+0xfd)[0x3bbfe1ed1d]
> python[0x400639]
> 
> 
> Does anyone have advice on where to start looking for the source of the 
> error? I haven't been able to run anything in FEniCS, but plain python works 
> without a problem. Could it be a conflict between MPI and threading? I recall 
> the FEniCS book mentioning the two were incompatible... (for now anyway).
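> 
> One thing I might try (just a sketch I haven't run) is a bare MPI program 
> that requests a threading level the way the init_mpi frame in the backtrace 
> seems to, to see whether MVAPICH2 itself survives MPI_Init_thread without 
> FEniCS in the picture:
> 
> // Standalone MPI check, no DOLFIN involved. If this also crashes in
> // create_2level_comm, the MVAPICH2 build / launch setup is the culprit.
> // Compile with something like: mpicxx mpi_test.cpp -o mpi_test
> #include <mpi.h>
> #include <cstdio>
> 
> int main(int argc, char* argv[])
> {
>   int provided = 0;
>   // MPI_THREAD_MULTIPLE is a guess at the level DOLFIN requests;
>   // trying MPI_THREAD_SINGLE as well would show whether threading matters.
>   MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
>   std::printf("MPI_Init_thread ok, provided = %d\n", provided);
>   MPI_Finalize();
>   return 0;
> }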
> 
> Thanks
> 
