[petsc-users] Compilation failure of PETSc with "The procedure name of the INTERFACE block conflicts with a name in the encompassing scoping unit"

2023-10-11 Thread Richter, Roland
Hi,

following my last question, I managed to configure PETSc with Intel MPI and
Intel OneAPI using the following configure line:

 

./configure --prefix=/media/storage/local_opt/petsc
--with-scalar-type=complex --with-cc=mpiicc --with-cxx=mpiicpc
--CPPFLAGS="-fPIC -march=native -mavx2" --CXXFLAGS="-fPIC -march=native
-mavx2" --with-fc=mpiifort --with-pic=true --with-mpi=true
--with-blaslapack-dir=/opt/intel/oneapi/mkl/latest/lib/intel64/
--with-openmp=true --download-hdf5=yes --download-netcdf=yes
--download-chaco=no --download-metis=yes --download-slepc=yes
--download-suitesparse=yes --download-eigen=yes --download-parmetis=yes
--download-ptscotch=yes --download-mumps=yes --download-scalapack=yes
--download-superlu=yes --download-superlu_dist=yes --with-mkl_pardiso=1
--with-boost=1 --with-boost-dir=/media/storage/local_opt/boost
--download-opencascade=yes --with-fftw=1
--with-fftw-dir=/media/storage/local_opt/fftw3 --download-kokkos=yes
--with-mkl_sparse=1 --with-mkl_cpardiso=1 --with-mkl_sparse_optimize=1
--download-muparser=yes --download-p4est=yes --download-sowing=yes
--download-viennalcl=yes --with-zlib --force=1 --with-clean=1 --with-cuda=0

 

Now, however, compilation fails with the following error:

/home/user/Downloads/git-files/petsc/include/../src/ksp/f90-mod/ftn-auto-interfaces/petscpc.h90(699): error #6623: The procedure name of the INTERFACE block conflicts with a name in the encompassing scoping unit.   [PCGASMCREATESUBDOMAINS2D]

  subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)

-^

/home/user/Downloads/git-files/petsc/include/../src/ksp/f90-mod/ftn-auto-interfaces/petscpc.h90(1199): error #6623: The procedure name of the INTERFACE block conflicts with a name in the encompassing scoping unit.   [PCASMCREATESUBDOMAINS2D]

  subroutine PCASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,z)

-^
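
One way to narrow this down is to look for the duplicate declarations directly (a sketch only; the repository path is taken from the error message above):

cd /home/user/Downloads/git-files/petsc
# List every occurrence of the offending name under the Fortran module
# sources; the conflict implies it is declared more than once there,
# including in the generated files under ftn-auto-interfaces/:
grep -rn "PCGASMCreateSubdomains2D" src/ksp/f90-mod/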

I'm on the latest version of origin/main, but I can't figure out how to fix
this issue myself, so I'd appreciate additional insight.

Thanks!

Regards,

Roland Richter

 







Re: [petsc-users] Configuration of PETSc with Intel OneAPI and Intel MPI fails

2023-10-11 Thread Richter, Roland
Hi,
Thank you very much for the answer! I looked it up, but petsc.org seems to
be a bit unstable from here; quite often I can't reach it.
Regards,
Roland Richter

-Original Message-
From: Satish Balay
Sent: Monday, 9 October 2023 17:29
To: Barry Smith
Cc: Richter, Roland; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Configuration of PETSc with Intel OneAPI and
Intel MPI fails

Will note - OneAPI MPI usage is documented at
https://petsc.org/release/install/install/#mpi

Satish

On Mon, 9 Oct 2023, Barry Smith wrote:

> 
>   Instead of using the "mpiicc -cc=icx" style, use --with-cc=mpiicc (etc.) and
> 
> export I_MPI_CC=icx
> export I_MPI_CXX=icpx
> export I_MPI_F90=ifx
> 
> 
> > On Oct 9, 2023, at 8:32 AM, Richter, Roland wrote:
> > 
> > Hi,
> > I'm currently trying to install PETSc on a server (Ubuntu 22.04) with
Intel MPI and Intel OneAPI. To combine both, I have to use, for example,
"mpiicc -cc=icx" as the C compiler, as described in
https://stackoverflow.com/a/76362396. Therefore, I adapted the
configure line as follows:
> >  
> > ./configure --prefix=/media/storage/local_opt/petsc
--with-scalar-type=complex --with-cc="mpiicc -cc=icx" --with-cxx="mpiicpc
-cxx=icpx" --CPPFLAGS="-fPIC -march=native -mavx2" --CXXFLAGS="-fPIC
-march=native -mavx2" --with-fc="mpiifort -fc=ifx" --with-pic=true
--with-mpi=true
--with-blaslapack-dir=/opt/intel/oneapi/mkl/latest/lib/intel64/
--with-openmp=true --download-hdf5=yes --download-netcdf=yes
--download-chaco=no --download-metis=yes --download-slepc=yes
--download-suitesparse=yes --download-eigen=yes --download-parmetis=yes
--download-ptscotch=yes --download-mumps=yes --download-scalapack=yes
--download-superlu=yes --download-superlu_dist=yes --with-mkl_pardiso=1
--with-boost=1 --with-boost-dir=/media/storage/local_opt/boost
--download-opencascade=yes --with-fftw=1
--with-fftw-dir=/media/storage/local_opt/fftw3 --download-kokkos=yes
--with-mkl_sparse=1 --with-mkl_cpardiso=1 --with-mkl_sparse_optimize=1
--download-muparser=no --download-p4est=yes --download-sowing=yes
--download-viennalcl=yes --with-zlib --force=1 --with-clean=1
--with-cuda=1
> >  
> > The configuration, however, fails with 
> >  
> > The CMAKE_C_COMPILER:
> >  
> > mpiicc -cc=icx
> >  
> >   is not a full path and was not found in the PATH
> >  
> > for all additional modules which use a CMake-based configuration
approach (such as OPENCASCADE). How could I solve that problem?
> >  
> > Thank you!
> > Regards,
> > Roland Richter
> > 
> 
> 
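
For reference, the suggested setup amounts to roughly the following (a sketch only: the exports are the ones from Barry's reply above, and the remaining options are unchanged from the original configure command):

export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_F90=ifx

# Plain wrapper names instead of "mpiicc -cc=icx"; append the remaining
# options from the original configure line unchanged:
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
  --prefix=/media/storage/local_opt/petsc --with-scalar-type=complex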

