Dear Bruno,
Thank you very much! I am using mpi/gcc right now. I will switch to the
Intel MPI library.
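
For reference, the module switch will look roughly like this (a sketch; the
module names are assumptions and the real ones on the cluster may differ):

```shell
# Replace the GCC-based MPI stack with Intel MPI.
# Module names below are hypothetical; run `module avail` for the real ones.
module unload mpi/gcc
module load intelmpi/2019.0.117

# Verify that the MPI compiler wrapper now comes from Intel MPI.
which mpicc
mpicc -show   # prints the underlying compiler and the Intel MPI link line
```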
Best regards
On Thu, Jul 23, 2020 at 9:08 PM Bruno Turcksin
wrote:
> Yuesu Jin,
>
> You don't need to compile your own PETSc, but you do need to use the same
> MPI library as the one PETSc is using. Very hard-to-debug problems appear
> when PETSc and deal.II use different MPI libraries. I think you want to
> use this MPI library:
> project/cacds/apps/intelmpi/2019.0.117/intel64/lib/release/libmpi.so
> so you probably need to load a module to use Intel MPI.
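
[Editor's note: one quick way to see which MPI a PETSc installation was built
against is to inspect its petscvariables file, the same file the deal.II
configure step reads. A sketch, assuming the cluster's PETSc module sets
PETSC_DIR and installs petscvariables in the usual prefix-install location:]

```shell
# petscvariables records the compilers and flags PETSc was configured with.
# PETSC_DIR is assumed to be exported by the cluster's PETSc module; the
# file may live under $PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/ instead.
grep -E '^(CC|FC|MPI)' "$PETSC_DIR/lib/petsc/conf/petscvariables"

# Alternatively, check which libmpi.so the PETSc shared library links against:
ldd "$PETSC_DIR/lib/libpetsc.so" | grep -i mpi
```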
>
> Best,
>
> Bruno
>
> On Thursday, July 23, 2020 at 7:47:23 PM UTC-4, yuesu jin wrote:
>>
>> Dear all,
>> I installed deal.II on a cluster.
>>
>> The first thing I found is that
>> /dealii-9.2.0/cmake/configure/configure_1_mpi.cmake
>> sets the DEAL_II_WITH_MPI option to "off", so the MPI detection
>> (MPI_FOUND) never runs. After I switched it on, CMake was able to find
>> the MPI library.
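
[Editor's note: instead of editing configure_1_mpi.cmake, the option can be
set on the cmake command line, which overrides the default without touching
the deal.II sources. A sketch; the source path below is an assumption:]

```shell
# Configure deal.II from a clean build directory with MPI support enabled.
# ~/DEALII/dealii-9.2.0 is a hypothetical source path; adjust as needed.
mkdir build && cd build
cmake -DDEAL_II_WITH_MPI=ON ~/DEALII/dealii-9.2.0
```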
>>
>> The second thing I found is very strange. CMake gives the message "Could
>> not find a sufficient PETSC installation: PETSC is compiled against a
>> different MPI library than the one deal.II picked up." The PETSc library
>> is provided as part of the cluster's public software stack; I can load it
>> with "module add". Does this message mean that I need to compile a new
>> version of PETSc, different from the one in the public library?
>>
>> Thanks!
>> Best
>>
>>
>> -- Include
>> /home/yjin6/DEALII/dealii-9.2.0/cmake/configure/configure_3_petsc.cmake
>> -- Found PETSC_LIBRARY
>> -- Found PETSC_INCLUDE_DIR_ARCH
>> -- Found PETSC_INCLUDE_DIR_COMMON
>> -- Found PETSC_PETSCVARIABLES
>> -- Found PETSC_LIBRARY_mkl_scalapack_lp64
>> -- Found PETSC_LIBRARY_mkl_blacs_intelmpi_lp64
>> -- Found PETSC_LIBRARY_mkl_intel_lp64
>> -- Found PETSC_LIBRARY_mkl_sequential
>> -- Found PETSC_LIBRARY_mkl_core
>> -- Found PETSC_LIBRARY_umfpack
>> -- Found PETSC_LIBRARY_klu
>> -- Found PETSC_LIBRARY_cholmod
>> -- Found PETSC_LIBRARY_btf
>> -- Found PETSC_LIBRARY_ccolamd
>> -- Found PETSC_LIBRARY_colamd
>> -- Found PETSC_LIBRARY_camd
>> -- Found PETSC_LIBRARY_amd
>> -- Found PETSC_LIBRARY_suitesparseconfig
>> -- Found PETSC_LIBRARY_HYPRE
>> -- Found PETSC_LIBRARY_fftw3xc_intel_pic
>> -- Found PETSC_LIBRARY_fftw3x_cdft_lp64_pic
>> -- Found PETSC_LIBRARY_mkl_cdft_core
>> -- Found PETSC_LIBRARY_mkl_blacs_intelmpi_lp64
>> -- Found PETSC_LIBRARY_mkl_intel_lp64
>> -- Found PETSC_LIBRARY_mkl_sequential
>> -- Found PETSC_LIBRARY_mkl_core
>> -- Found PETSC_LIBRARY_mkl_intel_lp64
>> -- Found PETSC_LIBRARY_mkl_sequential
>> -- Found PETSC_LIBRARY_mkl_core
>> -- Found PETSC_LIBRARY_parmetis
>> -- Found PETSC_LIBRARY_metis
>> -- Performing Test PETSC_LIBRARY_dl
>> -- Performing Test PETSC_LIBRARY_dl - Success
>> -- Found PETSC_LIBRARY_iomp5
>> -- Performing Test PETSC_LIBRARY_pthread
>> -- Performing Test PETSC_LIBRARY_pthread - Success
>> -- Performing Test PETSC_LIBRARY_rt
>> -- Performing Test PETSC_LIBRARY_rt - Success
>> -- Found PETSC_LIBRARY_mpifort
>> -- Found PETSC_LIBRARY_mpi
>> -- Found PETSC_LIBRARY_ifport
>> -- Found PETSC_LIBRARY_ifcoremt_pic
>> -- Found PETSC_LIBRARY_imf
>> -- Found PETSC_LIBRARY_svml
>> -- Performing Test PETSC_LIBRARY_m
>> -- Performing Test PETSC_LIBRARY_m - Success
>> -- Found PETSC_LIBRARY_ipgo
>> -- Found PETSC_LIBRARY_irc
>> -- Found PETSC_LIBRARY_irc_s
>> -- Found PETSC_LIBRARY_iomp5
>> -- PETSC_VERSION: 3.10.2.0
>> -- PETSC_LIBRARIES:
>>