Are you using the same MPI to build both PETSc and your application?

Satish
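For example, you can check which MPI each side is using with something like the following (a rough sketch; the exact petscvariables path assumes a default PETSC_DIR/PETSC_ARCH layout, and --showme assumes OpenMPI wrappers):

    which mpif90
    mpif90 --showme
    grep -E '^(CC|FC|MPIEXEC) ' $PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/petscvariables

If the two do not point at the same OpenMPI, one option is to configure PETSc against the MPI wrappers your code already uses instead of downloading a second OpenMPI, e.g. (again only a sketch):

    ./configure --with-debugging=yes --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-cmake=yes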
On Wed, 2022-02-09 at 05:21 +0100, Bojan Niceno wrote:
> To whom it may concern,
>
> I am working on a Fortran (2003) computational fluid dynamics solver,
> which is actually quite mature, was parallelized with MPI from the
> very beginning, and comes with its own suite of Krylov solvers.
> Although the code is self-sustained, I am inclined to believe that it
> would be better to use PETSc instead of my own home-grown solvers.
>
> In the attempt to do so, I have installed PETSc 3.16.4 with the
> following options:
>
> ./configure --with-debugging=yes --download-openmpi=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-cmake=yes
>
> on a workstation running Ubuntu 20.04 LTS. The mpif90 command which
> I use to compile the code wraps gfortran with OpenMPI, hence the
> option "--download-openmpi=yes" when configuring PETSc.
>
> Anyhow, installation of PETSc went fine, and I managed to link and run
> it with my code, but I am getting the following messages during
> compilation:
>
> Petsc_Mod.f90:18:6:
>
>    18 |   use PetscMat, only: tMat, MAT_FINAL_ASSEMBLY
>       |      1
> Warning: Named COMMON block ‘mpi_fortran_bottom’ at (1) shall be of
> the same size as elsewhere (4 vs 8 bytes)
>
> Petsc_Mod.f90 is a module I wrote for interfacing PETSc. All works,
> but these messages give me a reason to worry.
>
> Can you tell what causes these warnings? I would guess they might
> appear if one mixes OpenMPI with MPICH, but I don't think I even have
> MPICH on my system.
>
> Please let me know what you think about it.
>
> Cheers,
>
> Bojan
