> On Mar 11, 2015, at 8:11 AM, Roy Stogner <royst...@ices.utexas.edu> wrote:
>
> On Wed, 11 Mar 2015, Manav Bhatia wrote:
>
>> I seem to be facing the same issue as described here:
>> http://sourceforge.net/p/libmesh/mailman/message/31297017/
>>
>> If I build with pthreads and tbb, the code hangs at the singleton::setup
>> call.
>
> Do you mean Singleton::Setup::Setup() (as in the above issue, which
> was fixed)? Or Singleton::setup()? Either way, could you post a
> stack trace?
I meant the call through libmesh.C, line 370, to Singleton::setup(); I am working on getting the stack trace. The problem is that if I comment the "lock" line out, then the code crashes with a segmentation fault and no stack trace. There is, obviously, something that I am not doing right.

>> Without pthreads, it seems to be crashing at the following assert for
>> remote_elem.
>
> I didn't see an assert in your email. We'll need to see the stack
> trace, too; most of the asserts in RemoteElem are of the form "this
> function shouldn't ever be called" and aren't informative unless you
> know which code is making the call.

I was alluding to the call immediately after the setup, which reads

    libmesh_assert(remote_elem);

Sorry I have not been clear and specific.

>> I noticed that the earlier discussion had some talk about a patch
>> to comment out the scoped_lock calls. Is that still applicable?
>
> Well, if we commented out *all* the scoped_lock calls then we'd be
> running into race condition errors left and right. I removed just the
> one scoped_lock call that appeared to be extraneous and appeared to be
> using an uninitialized mutex.
>
>> I am using the Intel 11.1 compiler with OpenMPI 1.4.2, if that is of relevance.
>
> It does rule out a couple of possible problems, thanks.
>
> Which libMesh release (or git checkout hash) are you using?

I am at the following commit: abb4c2ffde19788495882db392f61fde18d365d3

Is there a gcc vs. icc thing here, perhaps?

> What's your configure line? Check your libmesh_config.h and make sure
> it's detecting TBB. IIRC we don't internally *use* pthreads unless
> TBB isn't found.

It was able to find TBB.
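For anyone checking the same thing: TBB detection can be confirmed by grepping the generated libmesh_config.h. The path depends on your install prefix; the stand-in header written below is fabricated only so the command is reproducible anywhere — in a real tree, point grep at $PREFIX/include/libmesh_config.h.

```shell
# Sketch: confirm configure detected TBB by inspecting libmesh_config.h.
# Write a minimal stand-in header (hypothetical path) so the check can run
# anywhere; replace the path with your real install's libmesh_config.h.
cat > /tmp/libmesh_config_sample.h <<'EOF'
#ifndef LIBMESH_HAVE_TBB_API
#define LIBMESH_HAVE_TBB_API 1
#endif
EOF
grep 'LIBMESH_HAVE_TBB_API' /tmp/libmesh_config_sample.h
```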
Here are the relevant lines:

/* Flag indicating whether the library shall be compiled to use the Threading Building Blocks */
#ifndef LIBMESH_HAVE_TBB_API
#define LIBMESH_HAVE_TBB_API 1
#endif

/* define if the compiler supports tbb::tbb_thread */
#ifndef LIBMESH_HAVE_TBB_CXX_THREAD
#define LIBMESH_HAVE_TBB_CXX_THREAD 1
#endif

And here are the configure options:

/* Configuration information. */
#ifndef LIBMESH_CONFIGURE_INFO
#define LIBMESH_CONFIGURE_INFO "./configure '--prefix=/cavs/projects/sams/codes/raptor/libmesh/libmesh/../'
 '--enable-mpi' '--enable-unique-id' '--enable-dependency-tracking' '--enable-fortran'
 '--enable-shared' '--enable-exceptions' '--disable-openmp' '--disable-default-comm-world'
 '--enable-tracefiles' '--enable-amr' '--enable-vsmoother' '--enable-periodic'
 '--enable-dirichlet' '--enable-parmesh' '--enable-nodeconstraint' '--enable-ghosted'
 '--enable-pfem' '--enable-ifem' '--enable-second' '--enable-xdr' '--enable-reference-counting'
 '--enable-perflog' '--enable-examples' '--enable-boost' '--disable-trilinos' '--enable-tbb'
 '--disable-pthreads' '--enable-sfc' '--disable-tecplot' '--disable-tecio' '--enable-metis'
 '--enable-parmetis' '--enable-tetgen' '--enable-triangle' '--enable-vtk' '--enable-hdf5'
 '--enable-libHilbert' '--enable-nanoflann' '--enable-exodus' '--enable-netcdf'
 '--enable-petsc' '--enable-slepc' '--with-mpi=/usr/local/mpi-intel/x86_64/openmpi-1.5.5/'
 '--with-metis=PETSc' '--with-hdf5=/usr/local/hdf5-1.8.11/'
 '--with-vtk-lib=/usr/local/vtk/x86_64/lib/vtk-5.4/'
 '--with-vtk-include=/usr/local/vtk/x86_64/include/vtk-5.4/'
 '--with-tbb=/usr/local/intel-11.1/tbb' '--with-tbb-lib=/usr/local/intel-11.1/tbb/lib/intel64'
 '--with-methods=opt dbg' 'CXX=mpicxx'
 'CXXFLAGS=-I/cavs/projects/sams/codes/raptor/libmesh/libmesh/../../petsc/include -I/usr/local/vtk/x86_64/include/vtk-5.4/'
 'LIBS=-L/usr/local/intel-11.1/tbb/lib/intel64 -L/usr/local/mpi-intel/x86_64/openmpi-1.4.2/lib -lmpi_cxx -lmpi'
 'CC=mpicc'
 'CFLAGS=-I/cavs/projects/sams/codes/raptor/libmesh/libmesh/../../petsc/include'
 'FC=mpif90' 'F77=mpif90'
 'PETSC_DIR=/cavs/projects/sams/codes/raptor/libmesh/libmesh/../../petsc'
 'SLEPC_DIR=/cavs/projects/sams/codes/raptor/libmesh/libmesh/../../slepc'"
#endif

> Thanks,
> ---
> Roy

_______________________________________________
Libmesh-users mailing list
Libmesh-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/libmesh-users