To make my question more concrete, here is what our PETSc install
script looks like. It is pretty much a configure / make / make install.
# module --force purge
# module load compilers/intel mpi/openmpi libs/mkl/11.1 apps/cmake/2.8.12 \
    apps/buildtools apps/devtools mpi/openmpi/1.8.8
# cd /software6/src
# NAME=petsc
# VERSION=3.7.2
# SRCDIR=${NAME}-${VERSION}
# COMPILER=intel
# MPI=openmpi1.8.8
# PREFIX=/software6/libs/${NAME}/${VERSION}_${COMPILER}_${MPI}_patched
# ARCHIVE=${NAME}-${VERSION}-build-${COMPILER}-${MPI}.tar.xz
#
# curl -O http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/${NAME}-lite-${VERSION}.tar.gz
# tar xvfz ${NAME}-lite-${VERSION}.tar.gz
# cd ${SRCDIR}
# export CFLAGS="-O3 -xHost -mkl -fPIC -m64 -no-diag-message-catalog"
# export FFLAGS="-O3 -xHost -mkl -fPIC -m64 -no-diag-message-catalog"
# export MPIDIR=$(dirname $(dirname $(which mpiexec)))
# Required to build hypre, which expects a GNU AR
# unset AR
# ./config/configure.py PETSC_ARCH=linux-gnu-intel \
CFLAGS="$CFLAGS" FFLAGS="$FFLAGS" \
--prefix=$PREFIX \
--with-x=0 \
--with-mpi-compilers=1 \
--with-mpi-dir=$MPIDIR \
--known-mpi-shared-libraries=1 \
--with-debugging=no \
--with-shared-libraries=1 \
--with-blas-lapack-dir=$MKLROOT/lib/intel64 \
--with-scalapack=1 \
--with-scalapack-include=$MKLROOT/include \
--with-scalapack-lib="-lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64" \
--with-mkl_pardiso=1 \
--with-mkl_pardiso-dir=$MKLROOT \
--download-mumps=yes \
--download-ptscotch=yes \
--download-superlu=yes \
--download-superlu_dist=yes \
--download-parmetis=yes \
--download-metis=yes \
--download-ml=yes \
--download-suitesparse=yes \
--download-hypre=yes |& tee configure.log
# make PETSC_DIR=/software6/src/${SRCDIR} PETSC_ARCH=linux-gnu-intel all |& tee make.log
# make PETSC_DIR=/software6/src/${SRCDIR} PETSC_ARCH=linux-gnu-intel install |& tee make-install.log
# make PETSC_DIR=$PREFIX test |& tee make-test.log
# chmod -R g+w ${PREFIX}; chmod g+w ${PREFIX}/..
# cd ..
# tar cfJ ${ARCHIVE} ${SRCDIR} && xz -t ${ARCHIVE} && rm -rf ${SRCDIR}
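
If we did want to capture this in an EasyBuild easyconfig instead, the
--download-* approach might look roughly like the sketch below. This is
only an illustration: the easyblock, toolchain name/version, and option
spellings are my assumptions, not a tested recipe.

```python
# Hypothetical easyconfig sketch -- toolchain and easyblock are assumptions
easyblock = 'ConfigureMake'

name = 'PETSc'
version = '3.7.2'

homepage = 'http://www.mcs.anl.gov/petsc'
description = "Portable, Extensible Toolkit for Scientific Computation"

# iomkl = Intel compilers + OpenMPI + MKL; version placeholder left unfilled
toolchain = {'name': 'iomkl', 'version': '...'}

source_urls = ['http://ftp.mcs.anl.gov/pub/petsc/release-snapshots']
sources = ['petsc-lite-%(version)s.tar.gz']

# Let PETSc fetch and build the versions of the third-party
# packages its authors chose, instead of listing them as dependencies
configopts = ' '.join('--download-%s=yes' % p for p in [
    'mumps', 'ptscotch', 'superlu', 'superlu_dist',
    'parmetis', 'metis', 'ml', 'suitesparse', 'hypre',
])
```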
Maxime
On 17-02-21 13:31, Maxime Boissonneault wrote:
Hi,
I am looking at installing PETSc through EasyBuild. I am surprised to
see that the EasyBlock relies heavily on other recipes. My personal
experience with PETSc is that it is best to stick with whatever
versions the PETSc authors decided were best for each of the packages.
Therefore, on my system, PETSc depends only on MKL and OpenMPI.
For every other package, I use the --download-* configure options so
that PETSc fetches whatever version of each package it needs.
--download-mumps --download-ptscotch --download-superlu
--download-superlu_dist --download-parmetis --download-metis
--download-ml --download-suitesparse --download-hypre
Is this contrary to other people's experience?