Jan,

I tried the new installation method, and when it came to installing
PETSc it gave me this error:

2015/01/13 06:03:35 - INFO: [package:run_job]
*******************************************************************************
2015/01/13 06:03:35 - INFO: [package:run_job]          UNABLE to CONFIGURE
with GIVEN OPTIONS    (see configure.log for details):
2015/01/13 06:03:35 - INFO: [package:run_job]
-------------------------------------------------------------------------------
2015/01/13 06:03:35 - INFO: [package:run_job] Could not find a functional
LAPACK. Run with --with-lapack-lib=<lib> to indicate the library containing
LAPACK.
2015/01/13 06:03:35 - INFO: [package:run_job]  Or --download-fblaslapack=1
to have one automatically downloaded and installed
2015/01/13 06:03:35 - INFO: [package:run_job]
*******************************************************************************
2015/01/13 06:03:35 - INFO: [package:run_job]
2015/01/13 06:03:35 - ERROR: [package:run_job] Command '[u'/bin/bash',
'_hashdist/build.sh']' returned non-zero exit status 1
2015/01/13 06:03:35 - ERROR: [package:run_job] command failed (code=1);
raising

How do I set ./configure for PETSc such that it will link to the LAPACK
that was already successfully installed?
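
From the error message above, I am guessing it wants something like the
following, where the paths are just placeholders for wherever the earlier
LAPACK step installed its libraries:

  --with-blas-lib=/path/to/libblas.so \
  --with-lapack-lib=/path/to/liblapack.so

or, to let PETSc download and build its own copy:

  --download-fblaslapack=1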

Thanks,
Justin

On Tue, Jan 13, 2015 at 3:59 AM, Jan Blechta <[email protected]>
wrote:

> On Mon, 12 Jan 2015 18:45:13 -0600
> Justin Chang <[email protected]> wrote:
>
> > Hi all,
> >
> > So I am attempting to install FEniCS on our university's RHEL 6.6
> > cluster through the dorsal scripts. I normally use FEniCS via
> > Ubuntu's binary distribution, so this kind of installation is
> > uncharted territory; bear with me :)
> >
> > I had to modify the rhel6.platform file significantly, and the final
> > list of packages I needed was:
> >
> > PACKAGES=(
> > blas
> > boost
> > lapack
> > suitesparse
> > pcre
> > swig
> > cmake
> > vtkdata
> > vtk
> > parmetis
> > scotch
> > numpy
> > trilinos
> > petsc
> > slepc
> > armadillo
> > mpfr
> > cgal
> > skip:scientificpython
> > eigen
> > fiat
> > ferari
> > ufl
> > ffc
> > viper
> > instant
> > dolfin
> > )
> >
> > Everything works until I get to installing dolfin, and this is the
> > error I am getting:
> >
> > -- Build files have been written to:
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir
> > make: Entering directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > make[1]: Entering directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > make[2]: Entering directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > make[2]: Leaving directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > make[2]: Entering directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > [  0%] Building CXX object
> > dolfin/CMakeFiles/dolfin.dir/generation/UnitTetrahedronMesh.cpp.o
> > In file included from /home/jchang23/FEniCS/include/petscsys.h:105,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/common/types.h:29,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/function/Function.h:34,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/ale/MeshDisplacement.h:28,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/mesh/Mesh.h:38,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/mesh/MeshPartitioning.h:35,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/generation/UnitTetrahedronMesh.cpp:22:
> > /share/apps/openmpi-1.8.3/include/mpi.h:324: error: conflicting
> > declaration ‘typedef struct ompi_communicator_t* MPI_Comm’
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/common/MPI.h:42: error:
> > ‘MPI_Comm’ has a previous declaration as ‘typedef int MPI_Comm’
> > In file included from /home/jchang23/FEniCS/include/petscsys.h:1794,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/common/types.h:29,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/function/Function.h:34,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/ale/MeshDisplacement.h:28,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/mesh/Mesh.h:38,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/mesh/MeshPartitioning.h:35,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/generation/UnitTetrahedronMesh.cpp:22:
> > /home/jchang23/FEniCS/include/petsclog.h: In function ‘PetscErrorCode
> > PetscMPITypeSizeComm(MPI_Comm, PetscLogDouble*, PetscMPIInt*,
> > ompi_datatype_t*)’:
> > /home/jchang23/FEniCS/include/petsclog.h:323: error: cast from
> > ‘void*’ to ‘MPI_Comm’ loses precision
> > /home/jchang23/FEniCS/include/petsclog.h:324: error: cast from
> > ‘void*’ to ‘MPI_Comm’ loses precision
> > In file included from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/common/types.h:29,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/function/Function.h:34,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/ale/MeshDisplacement.h:28,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/mesh/Mesh.h:38,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/mesh/MeshPartitioning.h:35,
> >                  from
> > /home/jchang23/FEniCS/src/dolfin-1.4.0/dolfin/generation/UnitTetrahedronMesh.cpp:22:
> > /home/jchang23/FEniCS/include/petscsys.h: In function ‘PetscErrorCode
> > PetscCitationsRegister(const char*, PetscBool*)’:
> > /home/jchang23/FEniCS/include/petscsys.h:2650: error: cast from
> > ‘void*’ to ‘MPI_Comm’ loses precision
> > /home/jchang23/FEniCS/include/petscsys.h:2651: error: cast from
> > ‘void*’ to ‘MPI_Comm’ loses precision
> > /home/jchang23/FEniCS/include/petscsys.h:2652: error: cast from
> > ‘void*’ to ‘MPI_Comm’ loses precision
> > make[2]: *** [dolfin/CMakeFiles/dolfin.dir/generation/UnitTetrahedronMesh.cpp.o] Error 1
> > make[2]: Leaving directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > make[1]: *** [dolfin/CMakeFiles/dolfin.dir/all] Error 2
> > make[1]: Leaving directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > make: *** [all] Error 2
> > make: Leaving directory
> > `/home/jchang23/FEniCS/src/dolfin-1.4.0/dorsal_build_dir'
> > Failure with exit status: 2
> > Exit message: There was a problem building dolfin-1.4.0.
> >
> >
> > From the looks of it, I already have an existing installation of
> > OpenMPI, but my question is: what do I do to resolve this?
>
> This could have happened because DOLFIN did not find MPI at configure
> time and typedefed MPI_Comm to int, which conflicts with the MPI_Comm
> included from petscsys.h. Try adding
>
> -DMPI_CXX_COMPILER:FILEPATH=/share/apps/openmpi-1.8.3/bin/mpicxx
> -DMPI_C_COMPILER:FILEPATH=/share/apps/openmpi-1.8.3/bin/mpicc
> -DMPI_Fortran_COMPILER:FILEPATH=/share/apps/openmpi-1.8.3/bin/mpif90
>
> (or other correct paths) to CONFOPTS in
> <dorsal>/FEniCS/packages/dolfin.package
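>
> For example, the resulting line in dolfin.package might look something
> like this (a sketch only, untested; adjust the paths for your system):
>
>   CONFOPTS="${CONFOPTS} \
>     -DMPI_CXX_COMPILER:FILEPATH=/share/apps/openmpi-1.8.3/bin/mpicxx \
>     -DMPI_C_COMPILER:FILEPATH=/share/apps/openmpi-1.8.3/bin/mpicc \
>     -DMPI_Fortran_COMPILER:FILEPATH=/share/apps/openmpi-1.8.3/bin/mpif90"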
>
> Check <dolfin>/cmake/modules/FindMPI.cmake for more detailed
> instructions on how to configure DOLFIN with MPI.
>
> Note that Dorsal is going to be retired; you may have better luck with
> the new installation method: https://bitbucket.org/fenics-project/fenics-install
>
> Jan
>
> >
> > Thanks,
> > Justin
>
>
_______________________________________________
fenics-support mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics-support
