Hi Roy and Cody,

Sorry for the delay in answering… I had some problems with my HD and I needed 
to reinstall the system.

I can’t say if any other example fails since I’m still stuck with this problem.

I have tried Clang and I can’t even build the library. I receive the following 
error:

/Applications/Xcode.app/Contents/Developer/usr/bin/make  all-am
make[3]: Nothing to be done for `all-am'.
make[2]: Nothing to be done for `all-am'.
  CXX      src/base/libmesh_dbg_la-dof_map.lo
  CXX      src/base/libmesh_dbg_la-dof_map_constraints.lo
  CXX      src/base/libmesh_dbg_la-dof_object.lo
  CXX      src/base/libmesh_dbg_la-libmesh.lo
src/base/libmesh.C:40:10: fatal error: 'omp.h' file not found
#include <omp.h>
         ^
1 error generated.
make[1]: *** [src/base/libmesh_dbg_la-libmesh.lo] Error 1
make: *** [all-recursive] Error 1


I will try building the other examples and see what happens. I will also try 
MPICH2 instead of Open-MPI.


Currently I can’t paste the gdb output here because I am not able to build the 
code with either gcc or clang.

Best regards,

Daniel.



> On Jan 26, 2015, at 13:23, Cody Permann <[email protected]> wrote:
> 
> Are there any other examples that are crashing? Just last week I found a 
> repeatable problem that crashes or hangs every time I run it on OS X Yosemite. 
> Unfortunately it's a relatively complex problem with lots of different pieces 
> of code including the use of SuperLU. We tried the same problem on a stack 
> using MPICH2 instead of Open-MPI and I haven't gotten the problem to crash 
> since! Since your issue is a serial run, it may be unrelated. Have you tried 
> using Clang? It's a much better compiler on OS X, no need for Mac Ports.
> 
> Cody
> 
> On Sun Jan 25 2015 at 6:56:36 PM Daniel Vasconcelos <[email protected]> wrote:
> Hi,
> 
> I have recently upgraded from Mac OS X Maverick to Yosemite but so far I have 
> had a lot of trouble to build libMesh on the system. I am using MacPorts to 
> install the packages I need such as GCC, Open-MPI, PETSc and VTK.
> 
> I configured the master version of libMesh with the following command (in 
> attachment is the config log):
> 
> ./configure \
>     --prefix=/Users/dmva/Development/lib/libmesh-0.9.4 \
>     --with-tbb=/opt/local \
>     --with-eigen-include=/opt/local/include/eigen3 \
>     --with-glpk-include=/opt/local/include \
>     --with-glpk-lib=/opt/local/lib  \
>     --with-vtk-include=/opt/local/include/vtk-6.1 \
>     --with-vtk-lib=/opt/local/lib \
>     --with-slepc-include=/opt/local/include \
>     --with-slepc-lib=/opt/local/lib/slepc/lib  \
>     --enable-mpi \
>     --enable-perflog \
>     --enable-dirichlet \
>     --enable-periodic \
>     --enable-nodeconstraint \
>     --enable-blocked-storage \
>     --enable-petsc \
>     --enable-slepc \
>     --enable-parmesh \
>     --enable-vtk \
>     --enable-examples \
>     --disable-trilinos \
>     --disable-hdf5 \
>     --disable-netcdf-4 \
>     --disable-netcdf \
>     --disable-tecplot \
>     --disable-tecio \
>     METHOD=dbg
> 
> When I run make check I receive an error on the example adjoints_ex1. See 
> below:
> 
> [0]PETSC ERROR: 
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see 
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac 
> OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames 
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: --------------------- Error Message 
> --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.5.2, unknown
> [0]PETSC ERROR: 
> /Users/dmva/Development/lib/libmesh/examples/adjoints/adjoints_ex1/.libs/example-opt
>  on a arch-macports named MacBook-Pro-de-Daniel.local by dmva Sun Jan 25 
> 20:51:29 2015
> [0]PETSC ERROR: Configure options --prefix=/opt/local 
> --prefix=/opt/local/lib/petsc --with-valgrind=0 --with-shared-libraries 
> --with-c2html-dir=/opt/local --with-x=0 
> --with-blas-lapack-lib=/System/Library/Frameworks/Accelerate.framework/Versions/Current/Accelerate
>  --with-hwloc-dir=/opt/local --with-debugging=1 
> CC=/opt/local/bin/mpicc-openmpi-mp CXX=/opt/local/bin/mpicxx-openmpi-mp 
> FC=/opt/local/bin/mpif90-openmpi-mp F77=/opt/local/bin/mpif90-openmpi-mp 
> F90=/opt/local/bin/mpif90-openmpi-mp COPTFLAGS=-Os CXXOPTFLAGS=-Os 
> FOPTFLAGS=-Os LDFLAGS="-L/opt/local/lib -Wl,-headerpad_max_install_names" 
> CPPFLAGS=-I/opt/local/include CFLAGS="-Os -arch x86_64" CXXFLAGS=-Os 
> FFLAGS=-Os FCFLAGS=-Os F90FLAGS=-Os PETSC_ARCH=arch-macports 
> --with-mpiexec=mpiexec-openmpi-mp
> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
> 
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> 
> Any idea of what is going on? I have used the same setup on a Linux machine 
> and it is working perfectly.
> 
> Best Regards,
> 
> Daniel.                                           
> ------------------------------------------------------------------------------
> Dive into the World of Parallel Programming. The Go Parallel Website,
> sponsored by Intel and developed in partnership with Slashdot Media, is your
> hub for all things parallel software development, from weekly thought
> leadership blogs to news, videos, case studies, tutorials and more. Take a
> look and join the conversation now. http://goparallel.sourceforge.net/
> _______________________________________________
> Libmesh-users mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/libmesh-users