
> On Mar 12, 2024, at 11:54 PM, adigitoleo (Leon) <[email protected]> wrote:
> 
>>   You need to ./configure PETSc for HDF5 using
>> 
>>> --with-fortran-bindings=0 --with-mpi-dir=/usr --download-hdf5
>> 
> 
> Thanks, this has worked. I assumed PETSc would just pick up the HDF5
> library I already had on my system but perhaps that requires
> --with-hdf5-dir=/usr or something similar? Would this HDF5 library need
> to be configured for MPI as well?

  This could work if that HDF5 is configured for MPI. We prefer the --download-hdf5 option since it ensures the appropriate version of HDF5 is built with the same compilers and compile options as PETSc. A previously installed version of HDF5 is often incompatible in some way.
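
A sketch of the two configure routes, for reference (the --with-hdf5-dir=/usr path assumes the system HDF5 under /usr was built with MPI support; checking with h5cc -showconfig first is a good idea):

```shell
# Recommended: let PETSc build its own HDF5 with matching compilers/options.
./configure --with-fortran-bindings=0 --with-mpi-dir=/usr --download-hdf5

# Alternative sketch: use a system HDF5, but only if it is a parallel build.
# h5cc -showconfig reports "Parallel HDF5: yes" for MPI-enabled installs.
h5cc -showconfig | grep -i parallel
./configure --with-fortran-bindings=0 --with-mpi-dir=/usr --with-hdf5-dir=/usr
```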


> 
> The underworld3 test suite is mostly passing but I do get a handful of
> failures coming from
> 
>    petsc4py.PETSc.SNES.getConvergedReason()
> 
> giving -3 instead of the expected 0. But that's more a question for
> underworld devs.
> 
> Leon
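For what it's worth, a reason of -3 corresponds to SNES_DIVERGED_LINEAR_SOLVE in PETSc's SNESConvergedReason enum (the inner linear solve failed), while 0 is SNES_CONVERGED_ITERATING. A small lookup sketch, with the codes transcribed from PETSc's SNESConvergedReason (the helper name is mine; double-check the values against your PETSc version's petscsnes.h):

```python
# Sketch: map SNES converged-reason codes to symbolic names without petsc4py.
# Values transcribed from PETSc's SNESConvergedReason enum.
SNES_REASONS = {
    0: "CONVERGED_ITERATING",
    -1: "DIVERGED_FUNCTION_DOMAIN",
    -2: "DIVERGED_FUNCTION_COUNT",
    -3: "DIVERGED_LINEAR_SOLVE",
    -4: "DIVERGED_FNORM_NAN",
    -5: "DIVERGED_MAX_IT",
    -6: "DIVERGED_LINE_SEARCH",
}

def reason_name(code: int) -> str:
    """Return the symbolic name for a SNES converged-reason code."""
    return SNES_REASONS.get(code, f"unknown ({code})")

print(reason_name(-3))  # DIVERGED_LINEAR_SOLVE
```

So the failing tests are reporting that the linear solve inside SNES did not converge, which is indeed something to raise with the underworld3 developers.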
