On Mon, Jan 26, 2015 at 1:34 PM, Jan Blechta <[email protected]> wrote:
> On Mon, 26 Jan 2015 12:57:51 +0100
> Mikael Mortensen <[email protected]> wrote:
>
>>
>> > On 26 Jan 2015, at 12:36, Jan Blechta <[email protected]>
>> > wrote:
>> >
>> > On Mon, 26 Jan 2015 12:11:18 +0100
>> > Mikael Mortensen <[email protected]> wrote:
>> >
>> >>
>> >>> On 26 Jan 2015, at 11:53, Jan Blechta <[email protected]>
>> >>> wrote:
>> >>>
>> >>> On Mon, 26 Jan 2015 11:42:42 +0100
>> >>> Mikael Mortensen <[email protected]
>> >>> <mailto:[email protected]>> wrote:
>> >>>
>> >>>> Hi,
>> >>>>
>> >>>> With a recent (today's) hashdist installation I run the following
>> >>>> test.py script:
>> >>>>
>> >>>> from dolfin import *
>> >>>>
>> >>>> mesh = UnitSquareMesh(4, 4)
>> >>>> V = VectorFunctionSpace(mesh, 'CG', 1)
>> >>>> u = interpolate(Expression(('x[0]', 'x[1]')), V)
>> >>>> print "Hi"
>> >>>>
>> >>>>
>> >>>> [mikael@ubuntu tests (master)]$ python test.py
>> >>>> Hi
>> >>>>
>> >>>> and then it just hangs there without exiting. Same thing in
>> >>>> ipython. I am currently using host python, but I got the same
>> >>>> when I compiled with hashdist python. Anyone else experiencing
>> >>>> this weirdness? Any idea what is going on? I’m on Ubuntu 14.04
>> >>>> and I’ve done instant-clean.
>> >>>
>> >>> To debug it, you could try attaching a debugger (if it is allowed on
>> >>> your system)
>> >>>
>> >>> gdb python <pid>
>> >>>
>> >>> hit ^C, and print stacktrace
>> >>>
>> >>> (gdb) bt
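>> >>>
>> >>> If the pid is not obvious, something like
>> >>>
>> >>>   pgrep -f test.py
>> >>>
>> >>> should find it (assuming test.py is the only matching process).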
>> >>>
>> >>> Jan
>> >>
>> >> Thanks for the tip. Not really familiar with gdb so I tried pdb
>> >> first
>> >>
>> >> python -m pdb test_GaussDivergence.py
>> >>> /home/mikael/MySoftware/fenicstools/tests/test.py(5)<module>()
>> >> -> from dolfin import *
>> >> (Pdb) continue
>> >> Hi
>> >> The program finished and will be restarted
>> >>
>> >> and then it just hangs there. Ctrl+C does nothing.
>> >>
>> >>
>> >> With gdb I am getting somewhere though:
>> >>
>> >> gdb python
>> >> Reading symbols from python...(no debugging symbols found)...done.
>> >> (gdb) run test.py
>> >> Starting program: /home/mikael/.hashdist/bld/profile/6axink4nal3d/bin/python test.py
>> >> process 16999 is executing new program: /home/mikael/.hashdist/bld/python/qlvjlzdbishm/bin/python2.7
>> >> [Thread debugging using libthread_db enabled]
>> >> Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
>> >> [New Thread 0x7fffdc7e3700 (LWP 17005)]
>> >> Hi
>> >> ^C
>> >> Program received signal SIGINT, Interrupt.
>> >> __lll_lock_wait () at ../nptl/sysdeps/unix/sysv/linux/x86_64/lowlevellock.S:135
>> >> 135     ../nptl/sysdeps/unix/sysv/linux/x86_64/lowlevellock.S: No such file or directory.
>> >> (gdb) bt
>> >> #0  __lll_lock_wait () at ../nptl/sysdeps/unix/sysv/linux/x86_64/lowlevellock.S:135
>> >> #1  0x00007ffff7bc6657 in _L_lock_909 () from /lib/x86_64-linux-gnu/libpthread.so.0
>> >> #2  0x00007ffff7bc6480 in __GI___pthread_mutex_lock (mutex=0x7ffff063fa10 <attribute_lock+16>) at ../nptl/pthread_mutex_lock.c:79
>> >> #3  0x00007ffff03a140c in ompi_attr_get_c () from /home/mikael/.hashdist/bld/mpi/vrn4awnke2cx/lib/libmpi.so.1
>> >> #4  0x00007ffff03cba27 in MPI_Attr_get () from /home/mikael/.hashdist/bld/mpi/vrn4awnke2cx/lib/libmpi.so.1
>> >> #5  0x00007ffff07b2afb in Petsc_DelComm_Outer () from /home/mikael/.hashdist/bld/petsc/dfw5fwbfp6hl/lib/libpetsc.so.3.5
>> >> #6  0x00007ffff03a1e88 in ompi_attr_delete () from /home/mikael/.hashdist/bld/mpi/vrn4awnke2cx/lib/libmpi.so.1
>> >> #7  0x00007ffff03cb93c in MPI_Attr_delete () from /home/mikael/.hashdist/bld/mpi/vrn4awnke2cx/lib/libmpi.so.1
>> >> #8  0x00007ffff07a2baa in PetscCommDestroy () from /home/mikael/.hashdist/bld/petsc/dfw5fwbfp6hl/lib/libpetsc.so.3.5
>> >> #9  0x00007ffff07a52bc in PetscHeaderDestroy_Private () from /home/mikael/.hashdist/bld/petsc/dfw5fwbfp6hl/lib/libpetsc.so.3.5
>> >> #10 0x00007ffff081372c in ISLocalToGlobalMappingDestroy () from /home/mikael/.hashdist/bld/petsc/dfw5fwbfp6hl/lib/libpetsc.so.3.5
>> >> #11 0x00007ffff0819d08 in PetscLayoutDestroy () from /home/mikael/.hashdist/bld/petsc/dfw5fwbfp6hl/lib/libpetsc.so.3.5
>> >> #12 0x00007ffff0871718 in VecDestroy () from /home/mikael/.hashdist/bld/petsc/dfw5fwbfp6hl/lib/libpetsc.so.3.5
>> >> #13 0x00007ffff22bb15c in dolfin::PETScVector::~PETScVector() () from /home/mikael/.hashdist/bld/dolfin/wt5zll4ojwwy/lib/libdolfin.so.1.5
>> >> #14 0x00007ffff22bb229 in dolfin::PETScVector::~PETScVector() () from /home/mikael/.hashdist/bld/dolfin/wt5zll4ojwwy/lib/libdolfin.so.1.5
>> >> #15 0x00007ffff221c34a in dolfin::Function::~Function() () from /home/mikael/.hashdist/bld/dolfin/wt5zll4ojwwy/lib/libdolfin.so.1.5
>> >> #16 0x00007ffff221c4d9 in dolfin::Function::~Function() () from /home/mikael/.hashdist/bld/dolfin/wt5zll4ojwwy/lib/libdolfin.so.1.5
>> >> #17 0x00007ffff308f999 in std::_Sp_counted_base<(__gnu_cxx::_Lock_policy)2>::_M_release (this=0x19bbc60) at /usr/include/c++/4.8/bits/shared_ptr_base.h:144
>> >> #18 0x00007fffdf8093cc in _wrap_delete_Function () from /home/mikael/Software/hashstack/default/lib/python2.7/site-packages/dolfin/cpp/_function.so
>> >> #19 0x00007fffdf7fc15e in SwigPyObject_dealloc () from /home/mikael/Software/hashstack/default/lib/python2.7/site-packages/dolfin/cpp/_function.so
>> >> #20 0x00000000005392ff in ?? ()
>> >> #21 0x00000000004d914b in ?? ()
>> >> #22 0x00000000004fdb96 in PyDict_SetItem ()
>> >> #23 0x000000000055a9e1 in _PyModule_Clear ()
>> >> #24 0x00000000004f2ad4 in PyImport_Cleanup ()
>> >> #25 0x000000000042fa89 in Py_Finalize ()
>> >> #26 0x000000000046ac10 in Py_Main ()
>> >
>> > Can you reproduce it with PETSc example
>> > http://www.mcs.anl.gov/petsc/petsc-current/src/vec/is/examples/tests/ex1.c.html
>> > which also calls ISLocalToGlobalMappingDestroy?
>> > Just navigate to <petsc>/src/vec/is/examples/tests/, type make and
>> > execute.
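>> >
>> > For instance, something along these lines (assuming PETSC_DIR and
>> > PETSC_ARCH point at your hashdist PETSc build):
>> >
>> >   cd $PETSC_DIR/src/vec/is/examples/tests
>> >   make ex1
>> >   ./ex1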
>> >
>> > Jan
>> >
>>
>> The problem went away with openmpi 1.6.5. Btw, is the PETSc source
>> stored somewhere by hashdist?
>
> I don't know. But it would be helpful for debugging. I don't know how
> optimization flags are handled by hashdist but every FEniCS developer
> should have a build with at least
>
>    PETSc --with-debugging=1
>    DOLFIN -DCMAKE_BUILD_TYPE:STRING=Developer
>
> apart from production builds. Then it is useful to have the PETSc
> source in place, matching the debugging symbols in libpetsc.so. How is
> this handled, Johannes?

The source directory is removed after the build, but you can do

  hit bdir <profile> <package> <dir>

to unpack the source into <dir>. For instance

  hit bdir default.yaml petsc tmp

The sources for PETSc will then be in the tmp directory.
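
If gdb then needs to find the unpacked files, its directory command can
point it there, e.g. (the exact layout under tmp depends on the PETSc
tarball, so adjust the path):

  (gdb) directory tmp/petsc-3.5.2/src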

You can build PETSc in debug mode by setting "debug: true" in the profile:

  petsc:
    ...
    debug: true

You can also change the build type for DOLFIN:

  dolfin:
    ...
    build_type: Developer

The fenics-install-component.sh script uses -DCMAKE_BUILD_TYPE:STRING=Developer.
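
After editing the profile, rebuilding it should pick up the new settings,
e.g. (assuming the profile file is default.yaml, as in the bdir example
above):

  hit build default.yaml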

Johannes

> Sorry for OT but I'd like to know before switching to new
> hashdist/fenics-developer-tools framework.
>
> Jan
>
>>
>> M
>>
>> >>
>> >> So I’m trying to google this now. I think it’s because I’m using a
>> >> newer version of openmpi than in the default hashdist script.
>> >> Default is 1.6.5, but I have compiled with 1.8.2. 1.6.5 gives me
>> >> problems with h5py, which is why I upgraded it in the first place.
>> >> So at the moment h5py works, but not fenics:-(
>> >>
>> >> M
>> >>
>> >>>
>> >>>>
>> >>>> Mikael
>> >>>>
>> >>>>
>> >
>>
>
_______________________________________________
fenics-support mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics-support
