On Fri, 2 Jul 2021, Patrick Sanan wrote:

> As you mention in [4], the proximate cause of the configure failure is this
> link error [8].
> 
> Naively, that looks like a problem to be resolved at the level of the C++ 
> compiler and MPI.
> 
> Unless there are wrinkles of this build process that I don't understand 
> (likely), this [6] looks non-standard to me:
> 
>       includedir="${prefix}/include"
>       ...
>       ./configure --prefix=${prefix} \
>               ...
>               -with-mpi-include="${includedir}" \
>               ...
> 
> 
> Is it possible to configure using --with-mpi-dir instead of the separate
> --with-mpi-include and --with-mpi-lib options?

Well, --with-mpi-dir is preferable if using mpicc/mpif90 etc. from that location.
Otherwise - if one really needs to use the native compilers [aka gcc/gfortran] - it's
appropriate to use the --with-mpi-include/--with-mpi-lib options.
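
For example [a sketch - reusing the /workspace/destdir paths from your log]:

      # using the MPI compiler wrappers from that location:
      ./configure --prefix=${prefix} \
              --with-mpi-dir=/workspace/destdir

      # or with native compilers and MPI specified explicitly:
      ./configure --prefix=${prefix} \
              --with-cc=gcc --with-cxx=g++ --with-fc=gfortran \
              --with-mpi-include=/workspace/destdir/include \
              --with-mpi-lib="[/workspace/destdir/lib/libmpifort.so,/workspace/destdir/lib/libmpi.so]"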

Best to verify that the correct values are being used - i.e. compare against
'mpicc -show' [or equivalent] for this install.
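
For example, the wrapper prints something like this [illustrative only - the
exact output depends on the MPI implementation and build]; the include dir and
libraries it lists should match what is passed to configure:

      $ /workspace/destdir/bin/mpicc -show
      gcc -I/workspace/destdir/include -L/workspace/destdir/lib -lmpi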

 
--with-mpi-lib="[/workspace/destdir/lib/libmpi.so,/workspace/destdir/lib/libmpifort.so]"

This list appears to be in the wrong order - but then - the order usually 
doesn't matter for shared library usage.
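
That is, if the order did matter [e.g. for static libraries], libmpifort [which
uses symbols from libmpi] would normally be listed first:

      --with-mpi-lib="[/workspace/destdir/lib/libmpifort.so,/workspace/destdir/lib/libmpi.so]"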

Satish


> 
> 
> As an aside, maybe Satish can say more, but I'm not sure if it's advisable to 
> override variables in the make command [7].
> 
> [8]   
> https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-configure-log-L7795
> [6]   
> https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-build_tarballs-jl-L45
> [7]   
> https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-build_tarballs-jl-L55
> 
> 
> > On 02.07.2021 at 06:25, Kozdon, Jeremy (CIV) <[email protected]> wrote:
> > 
> > I have been talking with Boris Kaus and Patrick Sanan about trying to 
> > revive the Julia PETSc interface wrappers. One of the first things to get 
> > going is to use Julia's binary builder [1] to wrap more scalar, real, and 
> > int type builds of the PETSc library; the current distribution is just 
> > Real, double, Int32. I've been working on a PR for this [2] but have been 
> > running into some build issues on some architectures [3].
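> > 
> > For reference, a sketch of the per-variant configure options [these are the
> > standard PETSc flags; presumably the PR toggles combinations of them]:
> > 
> >       # e.g. a complex, single-precision, 64-bit-integer build:
> >       ./configure ... --with-scalar-type=complex \
> >               --with-precision=single \
> >               --with-64-bit-indices=1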
> > 
> > I doubt that anyone here is an expert with Julia's binary builder system, 
> > but I was wondering if anyone who is better with the PETSc build system can 
> > see anything obvious from the configure.log [4] that might help me sort out 
> > what's going on.
> > 
> > This exact script worked on 2020-08-20 [5] to build the libraries, so
> > something has obviously changed with either the Julia build system and/or
> > one (or more!) of the dependency binaries.
> > 
> > For those that don't know, Julia's binary builder system essentially allows 
> > users to download binaries directly from the web for any system that the 
> > Julia Programming language distributes binaries for. So a (desktop) user can
> > get MPI, PETSc, etc. without the headache of having to build anything from 
> > scratch; obviously on clusters you would still want to use system MPIs and 
> > what not.
> > 
> > ----
> > 
> > [1] https://github.com/JuliaPackaging/BinaryBuilder.jl
> > [2] https://github.com/JuliaPackaging/Yggdrasil/pull/3249
> > [3] 
> > https://github.com/JuliaPackaging/Yggdrasil/pull/3249#issuecomment-872698681
> > [4] 
> > https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-configure-log
> > [5] 
> > https://github.com/JuliaBinaryWrappers/PETSc_jll.jl/releases/tag/PETSc-v3.13.4%2B0
> 
