Re: [petsc-users] Spack build and ptscotch

2024-04-24 Thread Satish Balay via petsc-users
This is the complexity of maintaining dependencies (and dependencies
of dependencies) across different build systems:

- It's not easy to keep the "defaults" in both builds exactly the same.
- And it's not easy to expose all "variants", or keep the same variants in both 
builds.
- And each package has its own issues that prevent some combinations from
  working [or leave some combinations untested].

This e-mail query raises several things:

- understanding "why" the current [spack, petsc] build tools are implemented 
the way they are
- whether they can be improved
- the build use cases that you need working
- [and, subsequently, getting your code working]

Addressing them all is not easy - so let's stick with what you need to make 
progress.

For one - we recommend using the latest petsc version [i.e., 3.21 - not 3.19] - 
any fixes we make will target the current release.

> - spack: ptscotch will always be built without parmetis wrappers, can't turn 
> on

diff --git a/var/spack/repos/builtin/packages/petsc/package.py 
b/var/spack/repos/builtin/packages/petsc/package.py
index b7b1d86b15..ae27ba4c4e 100644
--- a/var/spack/repos/builtin/packages/petsc/package.py
+++ b/var/spack/repos/builtin/packages/petsc/package.py
@@ -268,9 +268,7 @@ def check_fortran_compiler(self):
 depends_on("metis@5:~int64", when="@3.8:+metis~int64")
 depends_on("metis@5:+int64", when="@3.8:+metis+int64")
 
-# PTScotch: Currently disable Parmetis wrapper, this means
-# nested disection won't be available thought PTScotch
-depends_on("scotch+esmumps~metis+mpi", when="+ptscotch")
+depends_on("scotch+esmumps+mpi", when="+ptscotch")
 depends_on("scotch+int64", when="+ptscotch+int64")
 
 depends_on("hdf5@:1.10+mpi", when="@:3.12+hdf5+mpi")

Now you can try:

spack install petsc~metis+ptscotch ^scotch+metis
vs
spack install petsc~metis+ptscotch ^scotch~metis [~metis is the default for 
scotch]
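As an aside on the spec syntax used above: `+`/`~` toggle variants, and `^` introduces a spec for a dependency. A toy parser (a hypothetical helper for illustration, not Spack's real API) showing how such a spec string decomposes:

```python
import re

def parse_spec(spec):
    """Split a spec like 'petsc~metis+ptscotch ^scotch+metis' into
    {package: {variant: enabled}} - illustration only, not Spack code."""
    out = {}
    for chunk in spec.replace("^", " ^").split():
        chunk = chunk.lstrip("^")  # '^' marks a dependency spec
        name = re.match(r"[A-Za-z0-9_-]+", chunk).group(0)
        # '+v' enables variant v, '~v' disables it
        out[name] = {v[1:]: v[0] == "+" for v in re.findall(r"[+~]\w+", chunk)}
    return out

print(parse_spec("petsc~metis+ptscotch ^scotch+metis"))
# {'petsc': {'metis': False, 'ptscotch': True}, 'scotch': {'metis': True}}
```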

Note the following comment in 
spack/var/spack/repos/builtin/packages/scotch/package.py


# Vendored dependency of METIS/ParMETIS conflicts with standard
# installations
conflicts("metis", when="+metis")
conflicts("parmetis", when="+metis")
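In effect, these directives tell the concretizer that scotch's vendored METIS/ParMETIS (enabled by `+metis`) cannot coexist with the standalone metis/parmetis packages in the same DAG. A minimal sketch of that rule (plain Python for illustration, not Spack's concretizer):

```python
def violates_conflicts(variants, dep_names, conflicts):
    """variants: enabled/disabled flags on scotch, e.g. {'+metis'};
    dep_names: other packages in the DAG; conflicts: (package, when-variant)
    pairs mirroring the conflicts() directives quoted above."""
    return any(when in variants and pkg in dep_names for pkg, when in conflicts)

CONFLICTS = [("metis", "+metis"), ("parmetis", "+metis")]

# scotch+metis alongside a standalone parmetis in the DAG -> rejected
print(violates_conflicts({"+metis"}, {"parmetis"}, CONFLICTS))  # True
# scotch~metis alongside parmetis -> allowed
print(violates_conflicts({"~metis"}, {"parmetis"}, CONFLICTS))  # False
```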

> - classical: ptscotch will always be built with parmetis wrappers, can't seem 
> to turn off

Looks like spack uses the cmake build of ptscotch, while PETSc uses the 
Makefile interface - which likely doesn't support turning off the metis 
wrappers [without hacks].

So you might either need to hack the scotch build via petsc - or just install 
scotch separately - and use that with petsc.

I see an effort to migrate the scotch build in petsc to cmake:

https://gitlab.com/petsc/petsc/-/merge_requests/7242/
https://gitlab.com/petsc/petsc/-/merge_requests/7495/

Satish

On Wed, 24 Apr 2024, Daniel Stone wrote:

> Hi PETSc community,
> 
> I've been looking at using Spack to build PETSc, in particular I need to
> disable the default metis/parmetis dependencies and use PTScotch instead,
> for our software.
> I've had quite a bit of trouble with this - it seems like something in the
> resulting build of our simulator ends up badly optimised, creating an MPI
> bottleneck, when I build against PETSc built with Spack.
> 
> I've been trying to track this down, and noticed this in the PETSc Spack
> build recipe:
> 
> # PTScotch: Currently disable Parmetis wrapper, this means
> # nested disection won't be available thought PTScotch
> depends_on("scotch+esmumps~metis+mpi", when="+ptscotch")
> depends_on("scotch+int64", when="+ptscotch+int64")
> 
> 
> Sure enough - when I compare the build with Spack and a traditional build
> with ./configure etc, I see that, in the traditional build, Scotch is
> always built with the parmetis wrapper,
> but not in the Spack build. In fact, I'm not sure how to turn off the
> parmetis wrapper option for scotch, in the case of a traditional build
> (i.e. there doesn't seem to be a flag in the
> configure script for it) - which would be a very useful test for me (I can
> of course do similar experiments by doing a classical build of petsc
> against ptscotch built separately without the
> wrappers, etc - will try that).
> 
> Does anyone know why the parmetis wrapper is always disabled in the spack
> build options? Is there something about Spack that would prevent it from
> working? It's notable - but I might
> be missing it - that there's no warning that there's a difference in the
> way ptscotch is built between the spack and classical builds:
> - classical: ptscotch will always be built with parmetis wrappers, can't
> seem to turn off
> - spack: ptscotch will always be built without parmetis wrappers, can't
> turn on
> 
> Any insight at all would be great, I'm new to Spack and am not super
> familiar with the logic that goes 

Re: [petsc-users] Problem with NVIDIA compiler and OpenACC

2024-04-05 Thread Satish Balay via petsc-users
Or you can skip fortran - if you are not using PETSc from fortran [or from any 
external package that requires it] - but you would need cxx for cuda:

--with-fc=0 --download-f2cblaslapack --with-cxx=0 --with-cudac=0

or

--with-fc=0 --download-f2cblaslapack --with-cudac=nvcc LIBS=-lstdc++

Satish

On Fri, 5 Apr 2024, Satish Balay wrote:

> >>>
> Executing: mpifort  -o /tmp/petsc-nopi85m9/config.compilers/conftest  -v   
> -KPIC -O2 -g /tmp/petsc-nopi85m9/config.compilers/conftest.o
> stdout:
> Export 
> NVCOMPILER=/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7
> Export PGI=/software/sse2/tetralith_el9/manual/nvhpc/23.7
> /software/sse2/generic/manual/ssetools/v1.9.5/wrappers/ld /usr/lib64/crt1.o 
> /usr/lib64/crti.o 
> /software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib/trace_init.o
>  /usr/lib/gcc/x86_64-redhat-linux/11//crtbegin.o 
> /software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib/f90main.o
>  --eh-frame-hdr -m elf_x86_64 -dynamic-linker /lib64/ld-linux-x86-64.so.2 -T 
> /software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib/nvhpc.ld
>  
> -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
>  -L/software/sse2/tetralith_el9/manual/FFTW/3.3.10/nv23.7/hpc1/lib 
> -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nvshmem/lib
>  
> -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nccl/lib
>  
> -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
>  
> -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/math_libs/lib64
 -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/extras/qd/lib
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/extras/CUPTI/lib64
 -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/lib64 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib
 -L/usr/lib64 -L/usr/lib/gcc/x86_64-redhat-linux/11/ 
/tmp/petsc-nopi85m9/config.compilers/conftest.o -rpath 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -rpath 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib 
-o /tmp/petsc-nopi85m9/config.compilers/conftest 
-L/usr/lib/gcc/x86_64-redhat-linux/11//../../../../lib64 -lnvf -lnvomp -ldl 
--as-needed -lnvhpcatm -latomic --no-as-needed -lpthread -lnvcpumath -lnsnvc 
-lnvc -lrt -lpthread -lgcc -lc -lgcc_s -lm /usr/lib/gcc/x86_64-redhat-linux/11//crtend.o /usr/lib64/crtn.o
> 
>   compilers: Libraries needed to link Fortran code with the C linker: 
> ['-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib',
>  
> '-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib',
>  
> '-Wl,-rpath,/software/sse2/tetralith_el9/manual/FFTW/3.3.10/nv23.7/hpc1/lib', 
> '-L/software/sse2/tetralith_el9/manual/FFTW/3.3.10/nv23.7/hpc1/lib', 
> '-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nvshmem/lib',
>  
> '-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nvshmem/lib',
>  
> '-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nccl/lib',
>  
> '-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nccl/lib',
>  
> '-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/math_libs/lib64',
>  
> '-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/math_libs/lib64',
>  '-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib', 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/extras/qd/lib',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/extras/qd/lib',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/extras/CUPTI/lib64',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/extras/CUPTI/lib64',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/lib64',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/lib64',
 '-Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/11', 
'-L/usr/lib/gcc/x86_64-redhat-linux/11', 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib',
 '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lmpi', 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib', 
'-lnvf', '-lnvomp', '-ldl', '-lnvhpcatm', '-latomic', '-lpthread', 
'-lnvcpumath', 

Re: [petsc-users] Problem with NVIDIA compiler and OpenACC

2024-04-05 Thread Satish Balay via petsc-users
>>>
Executing: mpifort  -o /tmp/petsc-nopi85m9/config.compilers/conftest  -v   
-KPIC -O2 -g /tmp/petsc-nopi85m9/config.compilers/conftest.o
stdout:
Export 
NVCOMPILER=/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7
Export PGI=/software/sse2/tetralith_el9/manual/nvhpc/23.7
/software/sse2/generic/manual/ssetools/v1.9.5/wrappers/ld /usr/lib64/crt1.o 
/usr/lib64/crti.o 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib/trace_init.o
 /usr/lib/gcc/x86_64-redhat-linux/11//crtbegin.o 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib/f90main.o
 --eh-frame-hdr -m elf_x86_64 -dynamic-linker /lib64/ld-linux-x86-64.so.2 -T 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib/nvhpc.ld
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
 -L/software/sse2/tetralith_el9/manual/FFTW/3.3.10/nv23.7/hpc1/lib 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nvshmem/lib
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nccl/lib
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/math_libs/lib64
 -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/extras/qd/lib
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/extras/CUPTI/lib64
 -L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/lib64 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib
 -L/usr/lib64 -L/usr/lib/gcc/x86_64-redhat-linux/11/ 
/tmp/petsc-nopi85m9/config.compilers/conftest.o -rpath 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -rpath 
/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib 
-o /tmp/petsc-nopi85m9/config.compilers/conftest 
-L/usr/lib/gcc/x86_64-redhat-linux/11//../../../../lib64 -lnvf -lnvomp -ldl 
--as-needed -lnvhpcatm -latomic --no-as-needed -lpthread -lnvcpumath -lnsnvc 
-lnvc -lrt -lpthread -lgcc -lc -lgcc_s -lm /usr/lib/gcc/x86_64-redhat-linux/11//crtend.o /usr/lib64/crtn.o

  compilers: Libraries needed to link Fortran code with the C linker: 
['-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib',
 '-Wl,-rpath,/software/sse2/tetralith_el9/manual/FFTW/3.3.10/nv23.7/hpc1/lib', 
'-L/software/sse2/tetralith_el9/manual/FFTW/3.3.10/nv23.7/hpc1/lib', 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nvshmem/lib',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nvshmem/lib',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nccl/lib',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/nccl/lib',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/math_libs/lib64',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/math_libs/lib64',
 '-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib', 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/extras/qd/lib',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/extras/qd/lib',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/extras/CUPTI/lib64',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/extras/CUPTI/lib64',
 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/lib64',
 
'-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/cuda/lib64',
 '-Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/11', 
'-L/usr/lib/gcc/x86_64-redhat-linux/11', 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib',
 '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lmpi', 
'-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/compilers/lib', '-lnvf', 
'-lnvomp', '-ldl', '-lnvhpcatm', '-latomic', '-lpthread', '-lnvcpumath', 
'-lnsnvc', '-lnvc', '-lrt', '-lgcc_s', '-lm']


PETSC_WITH_EXTERNAL_LIB = 
-Wl,-rpath,/proj/nsc/users/bramkamp/petsc_install/petsc_barry_fix_nvclib_no_cuda/lib
 -L/proj/nsc/users/bramkamp/petsc_install/petsc_barry_fix_nvclib_no_cuda/lib 
-Wl,-rpath,/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
 
-L/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7/comm_libs/mpi/lib
 

Re: [petsc-users] Problem with NVIDIA compiler and OpenACC

2024-04-04 Thread Satish Balay via petsc-users

On Thu, 4 Apr 2024, Frank Bramkamp wrote:

> Dear PETSC Team,
> 
> I found the following problem:
> I compile petsc 3.20.5 with Nvidia compiler 23.7.
> 
> 
> I use a pretty standard configuration, including
> 
> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpifort COPTFLAGS="-O2 -g" 
> CXXOPTFLAGS="-O2 -g" FOPTFLAGS="-O2 -g"  --with-debugging=0 --with-log=1 
> --download-fblaslapack --with-cuda=0
> 
> I exclude cuda, since I was not sure if the problem was cuda related. 

Can you try using (to exclude cuda): --with-cudac=0

> 
> 
> The problem is now: if I have a simple Fortran program where I link the petsc 
> library, but I actually do not use petsc in that program
> (Just for testing). I want to use OpenACC directives in my program, e.g. 
> !$acc parallel loop .
> The problem is that, as soon as I link with the petsc library, the openacc 
> commands do not work anymore.
> It seems that openacc is not initialised and hence it cannot find a GPU.
> 
> The problem seems to be that you link with -lnvc.
> In “petscvariables” => PETSC_WITH_EXTERNAL_LIB you include “-lnvc”.
> If I take this out, then openacc works. With “-lnvc” something gets messed up.
> 
> The problem is also discussed here:
> https://forums.developer.nvidia.com/t/failed-cuda-device-detection-when-explicitly-linking-libnvc/203225/1
> 
> My understanding is that libnvc is more a runtime library that does not need 
> to be included by the linker.
> Not sure if there is a specific reason to include libnvc (I am not so 
> familiar what this library does).
> 
> If I take out -lnvc from “petscvariables”, then my program with openacc works 
> as expected. I did not try any more realistic program that includes petsc.
> 
> 
> 
> 2)
> When compiling petsc with cuda support, I also found that in the petsc 
> library the library libnvJitLink.so.12
> Is not found. On my system this library is in $CUDA_ROOT/lib64
> I am not sure where this library is on your system ?! 

Hm - it would be good if you can send configure.log for this. configure runs 
'$CC -v' to determine the link libraries needed for c/c++/fortran 
compatibility - but it can also grab other libraries that the compilers use 
internally.
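To sketch the mechanism (an illustration of the idea, not PETSc's actual BuildSystem code): configure runs a verbose trial link and harvests the `-l` tokens from the output - which is how a compiler-internal runtime library like `-lnvc` can end up in the recorded link line:

```python
def harvest_link_libs(verbose_link_output):
    """Collect -l flags from a compiler's verbose link line (illustrating the
    over-collection problem; real configure parsing is more involved)."""
    return [tok for tok in verbose_link_output.split() if tok.startswith("-l")]

# Abbreviated stand-in for the 'mpifort -v' output quoted elsewhere in the thread
line = "ld crt1.o -L/opt/nvhpc/lib conftest.o -lnvf -lnvomp -ldl -lnvc -lm"
print(harvest_link_libs(line))  # ['-lnvf', '-lnvomp', '-ldl', '-lnvc', '-lm']
```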

To avoid this - you can explicitly list these libraries to configure. For ex: 
for gcc/g++/gfortran

./configure CC=gcc CXX=g++ FC=gfortran LIBS="-lgfortran -lstdc++"

Satish

> 
> 
> Thanks a lot, Frank Bramkamp
> 

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-03 Thread Satish Balay via petsc-users
With xcode-15.3 and branch "barry/2024-04-03/fix-chaco-modern-c/release" from 
https://gitlab.com/petsc/petsc/-/merge_requests/7433 [and a patched openmpi 
tarball to remove -Wl,-commons,use_dylibs] the following works for me.

Satish



petsc@mpro petsc.x % ./configure --download-bison --download-chaco 
--download-ctetgen --download-eigen --download-fftw --download-hdf5 
--download-hpddm --download-hwloc --download-hypre --download-libpng 
--download-metis --download-mmg --download-mumps --download-netcdf 
--download-openblas --download-openblas-make-options="'USE_THREAD=0 
USE_LOCKING=1 USE_OPENMP=0'" --download-p4est --download-parmmg 
--download-pnetcdf --download-pragmatic --download-ptscotch 
--download-scalapack --download-slepc --download-suitesparse 
--download-superlu_dist --download-tetgen --download-triangle --with-c2html=0 
--with-debugging=1 --with-fortran-bindings=0 --with-shared-libraries=1 
--with-x=0 --with-zlib 
--download-openmpi=https://web.cels.anl.gov/projects/petsc/download/externalpackages/openmpi-5.0.2-xcode15.tar.gz 
--download-pastix && make && make check

  CC arch-darwin-c-debug/obj/src/lme/interface/lmesolve.o
 CLINKER arch-darwin-c-debug/lib/libslepc.3.21.0.dylib
DSYMUTIL arch-darwin-c-debug/lib/libslepc.3.21.0.dylib
Now to install the library do:
make 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc 
PETSC_DIR=/Users/petsc/petsc.x install
=
*** Installing SLEPc ***
*** Installing SLEPc at prefix location: 
/Users/petsc/petsc.x/arch-darwin-c-debug  ***

Install complete.
Now to check if the libraries are working do (in current directory):
make SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug 
PETSC_DIR=/Users/petsc/petsc.x PETSC_ARCH=arch-darwin-c-debug check

/usr/bin/make --no-print-directory -f makefile PETSC_ARCH=arch-darwin-c-debug 
PETSC_DIR=/Users/petsc/petsc.x 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc 
install-builtafterslepc
/usr/bin/make --no-print-directory -f makefile PETSC_ARCH=arch-darwin-c-debug 
PETSC_DIR=/Users/petsc/petsc.x 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc 
slepc4py-install
make[6]: Nothing to be done for `slepc4py-install'.
*** Building and installing HPDDM ***
=
Now to check if the libraries are working do:
make PETSC_DIR=/Users/petsc/petsc.x PETSC_ARCH=arch-darwin-c-debug check
=
Running PETSc check examples to verify correct installation
Using PETSC_DIR=/Users/petsc/petsc.x and PETSC_ARCH=arch-darwin-c-debug
C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes
C/C++ example src/snes/tutorials/ex19 run successfully with HYPRE
C/C++ example src/snes/tutorials/ex19 run successfully with MUMPS
C/C++ example src/snes/tutorials/ex19 run successfully with SuiteSparse
C/C++ example src/snes/tutorials/ex19 run successfully with SuperLU_DIST
C/C++ example src/vec/vec/tests/ex47 run successfully with HDF5
Running SLEPc check examples to verify correct installation
Using 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc, 
PETSC_DIR=/Users/petsc/petsc.x, and PETSC_ARCH=arch-darwin-c-debug
C/C++ example src/eps/tests/test10 run successfully with 1 MPI process
C/C++ example src/eps/tests/test10 run successfully with 2 MPI processes
Completed SLEPc check examples
Completed PETSc check examples
petsc@mpro petsc.x % clang --version
Apple clang version 15.0.0 (clang-1500.3.9.4)
Target: arm64-apple-darwin23.4.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
petsc@mpro petsc.x % 


On Tue, 2 Apr 2024, Zongze Yang wrote:

> Thank you for the suggestion.
> 
> I'd like to share some test results using the current Xcode. When I added the 
> flag `LDFLAGS=-Wl,-ld_classic` and configured PETSc with OpenMPI, the tests 
> with the latest Xcode seemed okay, except for some link warnings. The 
> configure 
> command is
> ```
> ./configure \
> PETSC_ARCH=arch-darwin-c-debug-openmpi \
> LDFLAGS=-Wl,-ld_classic \
> 
> --download-openmpi=https://download.open-mpi.org/release/open-mpi/v5.0/openmpi-5.0.3rc1.tar.bz2 \
> --download-mumps --download-scalapack \
> --with-clean \
> && make && make check
> ```


Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-01 Thread Satish Balay via petsc-users
On Mon, 1 Apr 2024, Zongze Yang wrote:

> Thank you for your update.
> 
> I found some links that suggest this issue is related to the Apple linker, 
> which is causing problems with Fortran linking.
> 
> 1. https://github.com/open-mpi/ompi/issues/12427
> 2. https://x.com/science_dot/status/1768667417553547635?s=46

https://github.com/Homebrew/homebrew-core/issues/162714 recommends "downgrade 
CLT (or xcode?) to 15.1"

Satish


Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-01 Thread Satish Balay via petsc-users
On Mon, 1 Apr 2024, Zongze Yang wrote:

> 
> I noticed this in the config.log of OpenMPI:
> ```
> configure:30230: checking to see if mpifort compiler needs additional linker 
> flags
> configure:30247: gfortran -o conftest -fPIC -ffree-line-length-none 
> -ffree-line-length-0 -Wno-lto-type-mismatch -g -O0 -fallow-argument-mismatch  
> -Wl,-flat_namespace -Wl,-commons,use_dylibs  conftest.f90  >&5
> ld: warning: -commons use_dylibs is no longer supported, using error 
> treatment instead
> configure:30247: $? = 0
> configure:30299: result: -Wl,-commons,use_dylibs
> ```
> So, I find it odd that this flag isn't picked up on your platform, as it only 
> checked the exit value.

I get:

configure:30247: gfortran -o conftest -fPIC -ffree-line-length-none 
-ffree-line-length-0 -Wno-lto-type-mismatch -g -O0 -fallow-argument-mismatch  
-Wl,-flat_namespace -Wl,-commons,use_dylibs  conftest.f90  >&5
ld: unknown options: -commons 
collect2: error: ld returned 1 exit status
configure:30247: $? = 1
configure: failed program was:
| program test
| integer :: i
| end program
configure:30299: result: none

Note, I have an older xcode-15/CLT version:

petsc@npro ~ % clang --version
Apple clang version 15.0.0 (clang-1500.1.0.2.5)
Target: arm64-apple-darwin23.3.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
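So the probe is identical on both machines; only the linker's exit status differs. A minimal sketch (plain Python stand-ins for illustration, not OpenMPI's configure) of why an exit-code-only feature test accepts a flag that the newer ld merely warns about:

```python
import subprocess
import sys

def flag_accepted(trial_link_cmd):
    """A configure-style probe: the flag 'works' iff the trial link exits 0.
    Warning text printed to stderr is invisible to this test."""
    return subprocess.run(trial_link_cmd, capture_output=True).returncode == 0

# Stand-ins for the two toolchain behaviors seen above:
warns_only = [sys.executable, "-c",
              "import sys; print('ld: warning: -commons use_dylibs is no "
              "longer supported', file=sys.stderr)"]
hard_error = [sys.executable, "-c",
              "import sys; sys.exit('ld: unknown options: -commons')"]

print(flag_accepted(warns_only))  # True  -> result: -Wl,-commons,use_dylibs
print(flag_accepted(hard_error))  # False -> result: none
```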

Satish


Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-01 Thread Satish Balay via petsc-users
On Sun, 31 Mar 2024, Zongze Yang wrote:
> > ---
> > petsc@npro petsc % ./configure --download-bison --download-chaco 
> > --download-ctetgen --download-eigen --download-fftw --download-hdf5 
> > --download-hpddm --download-hwloc --download-hypre --download-libpng 
> > --download-metis --download-mmg --download-mumps --download-netcdf 
> > --download-openblas
>  --download-openblas-make-options="'USE_THREAD=0 USE_LOCKING=1 USE_OPENMP=0'" 
> --download-p4est --download-parmmg --download-pnetcdf --download-pragmatic 
> --download-ptscotch --download-scalapack --download-slepc 
> --download-suitesparse --download-superlu_dist --download-tetgen 
> --download-triangle --with-c2html=0 --with-debugging=1 --with-fortran-bindings=0 
> --with-shared-libraries=1 --with-x=0 --with-zlib 
> --download-openmpi=https://download.open-mpi.org/release/open-mpi/v5.0/openmpi-5.0.3rc1.tar.bz2 
> --download-pastix=https://web.cels.anl.gov/projects/petsc/download/externalpackages/pastix_5.2.3-p1.tar.bz2 && make && make check
> 
> There's an error encountered during configuration with the above options:
> ```
> TESTING: FortranMPICheck from 
> config.packages.MPI(config/BuildSystem/config/packages/MPI.py:676)
> *
>UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> details):
> -
>Fortran error! mpi_init() could not be located!
> *
> ```
> Please refer to the attached file for further information.

So I'm getting:

>>
*** Fortran compiler
checking whether the compiler supports GNU Fortran... yes
checking whether gfortran accepts -g... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking if Fortran compiler works... yes
checking for extra arguments to build a shared library... impossible -- -static
checking for gfortran warnings flags... none
checking for Fortran flag to compile .f files... none
checking for Fortran flag to compile .f90 files... none
checking if Fortran compilers preprocess .F90 files without additional flag... 
yes
checking to see if Fortran compilers need additional linker flags... 
-Wl,-flat_namespace
checking  external symbol convention... single underscore
checking if C and Fortran are link compatible... yes
checking to see if Fortran compiler likes the C++ exception flags... skipped 
(no C++ exceptions flags)
checking to see if mpifort compiler needs additional linker flags... none


However you are getting:


*** Fortran compiler
checking whether the compiler supports GNU Fortran... yes
checking whether gfortran accepts -g... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking if Fortran compiler works... yes
checking for extra arguments to build a shared library... impossible -- -static
checking for gfortran warnings flags... none
checking for Fortran flag to compile .f files... none
checking for Fortran flag to compile .f90 files... none
checking if Fortran compilers preprocess .F90 files without additional flag... 
yes
checking to see if Fortran compilers need additional linker flags... 
-Wl,-flat_namespace
checking  external symbol convention... single underscore
checking if C and Fortran are link compatible... yes
checking to see if Fortran compiler likes the C++ exception flags... skipped 
(no C++ exceptions flags)
checking to see if mpifort compiler needs additional linker flags... 
-Wl,-commons,use_dylibs


So gfortran [or ld from this newer xcode?] is behaving differently - and 
openmpi is picking up and using this broken/unsupported option - and likely 
triggering subsequent errors.

>>>
ld: warning: -commons use_dylibs is no longer supported, using error treatment 
instead
ld: common symbol '_mpi_fortran_argv_null_' from 
'/private/var/folders/tf/v4zjvtw12yb3tszk813gmnvwgn/T/petsc-xyn64q55/config.libraries/conftest.o'
 conflicts with definition from dylib '_mpi_fortran_argv_null_' from 
'/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/lib/libmpi_usempif08.40.dylib'
<<<

Or perhaps openmpi configure is affected by this new warning that this newer 
xcode spews
>>>
ld: warning: duplicate -rpath 
'/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current/gcc' ignored
<<<

I'm not sure what to suggest here [other than using Linux - and 

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-03-30 Thread Satish Balay via petsc-users
I'll just note - I can reproduce with:

petsc@npro petsc.x % ./configure --download-mpich --download-mumps 
--download-scalapack && make && make check 

And then - the following work fine for me:

petsc@npro petsc.x % ./configure --download-mpich --download-mumps 
--download-scalapack COPTFLAGS=-O0 FOPTFLAGS=-O0 LDFLAGS=-Wl,-ld_classic && 
make && make check

CLINKER arch-darwin-c-debug/lib/libpetsc.3.021.0.dylib
   DSYMUTIL arch-darwin-c-debug/lib/libpetsc.3.021.0.dylib
=
Now to check if the libraries are working do:
make PETSC_DIR=/Users/petsc/petsc.x PETSC_ARCH=arch-darwin-c-debug check
=
Running PETSc check examples to verify correct installation
Using PETSC_DIR=/Users/petsc/petsc.x and PETSC_ARCH=arch-darwin-c-debug
C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes
C/C++ example src/snes/tutorials/ex19 run successfully with MUMPS
Fortran example src/snes/tutorials/ex5f run successfully with 1 MPI process
Completed PETSc check examples
petsc@npro petsc.x %

petsc@npro petsc.z % ./configure 
--download-openmpi=https://download.open-mpi.org/release/open-mpi/v5.0/openmpi-5.0.3rc1.tar.bz2 
--download-mumps --download-scalapack && make && make check

   DSYMUTIL arch-darwin-c-debug/lib/libpetsc.3.021.0.dylib
=
Now to check if the libraries are working do:
make PETSC_DIR=/Users/petsc/petsc.z PETSC_ARCH=arch-darwin-c-debug check
=
Running PETSc check examples to verify correct installation
Using PETSC_DIR=/Users/petsc/petsc.z and PETSC_ARCH=arch-darwin-c-debug
C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes
C/C++ example src/snes/tutorials/ex19 run successfully with MUMPS
Fortran example src/snes/tutorials/ex5f run successfully with 1 MPI process
Completed PETSc check examples
petsc@npro petsc.z % 

[however parmmg and pastix are failing to build with openmpi]

And I thought this worked for me yesterday - but I see failures now.

./configure --download-bison --download-chaco --download-ctetgen 
--download-eigen --download-fftw --download-hdf5 --download-hpddm 
--download-hwloc --download-hwloc-configure-arguments=--disable-opencl 
--download-hypre --download-libpng --download-metis --download-mmg 
--download-mpich --download-mpich-configure-arguments=--disable-opencl 
--download-mumps --download-netcdf --download-openblas 
--download-openblas-make-options="'USE_THREAD=0 USE_LOCKING=1 USE_OPENMP=0'" 
--download-parmmg --download-pastix --download-pnetcdf --download-pragmatic 
--download-ptscotch --download-scalapack --download-slepc 
--download-suitesparse --download-superlu_dist --download-tetgen 
--download-triangle --with-c2html=0 --with-debugging=1 
--with-fortran-bindings=0 --with-shared-libraries=1 --with-x=0 --with-zlib 
--COPTFLAGS=-O0 --FOPTFLAGS=-O0 --LDFLAGS=-Wl,-ld_classic --with-clean

Satish

On Sat, 30 Mar 2024, Barry Smith wrote:

> 
>   Can you check the value of IRHSCOMP in the debugger? Using gdb as the 
> debugger may work better for this. 
> 
>   Barry
> 
> 
> > On Mar 30, 2024, at 3:46 AM, zeyu xia  wrote:
> > 
> > This Message Is From an External Sender
> > This message came from outside your organization.
> > Hi! Thanks for your reply.
> > 
> > There still exist some problems, as seen in the files 'configure.log', 
> > 'make check3.txt', and 'debug.txt' in the attachment. Particularly, the 
> > file 'debug.txt' contains the output of bt command of lldb.
> > 
> > Thanks for your attention.
> > 
> > Best regards,
> > Zeyu Xia
> > 
> > 
> > Satish Balay mailto:ba...@mcs.anl.gov>> wrote on Sat, 30 Mar 2024 at 02:52:
> >> I'm able to reproduce this error on a slightly older xcode [but don't know 
> >> why this issue comes up]
> >> 
> >> > Apple clang version 15.0.0 (clang-1500.1.0.2.5)
> >> 
> >> Can you try using the additional configure options (along with 
> >> LDFLAGS=-Wl,-ld_classic)  and see if it works?
> >> 
> >> COPTFLAGS=-O0 FOPTFLAGS=-O0
> >> 
> >> Satish
> >> 
> >> On Fri, 29 Mar 2024, zeyu xia wrote:
> >> 
> >> > Hi! I am grateful for your prompt response.
> >> > 
> >> > I followed your suggestions; however, it still does not work. For the
> >> > related information please find the files 'make check2.txt' and
> >> > 'configure.log' in the attachment.
> >> > 
> >> > If possible, please do me a favor again. Thanks for your patience.
> >> > 
> >> > Best wishes,
> >> > Zeyu Xia
> >> > 
> >> > 
> >> > Satish Balay mailto:ba...@mcs.anl.gov>> 
> >> > wrote on Fri, 29 Mar 2024 at 23:48:
> >> > 
> >> > > Could you:
> >> > >
> >> > > - reinstall brew after the xcode upgrade (not just update)
> >> > > 

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-03-29 Thread Satish Balay via petsc-users
I'm able to reproduce this error on a slightly older xcode [but don't know why 
this issue comes up]

> Apple clang version 15.0.0 (clang-1500.1.0.2.5)

Can you try using the additional configure options (along with 
LDFLAGS=-Wl,-ld_classic)  and see if it works?

COPTFLAGS=-O0 FOPTFLAGS=-O0
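For reference, the full invocation that worked for me combines these (the --download options here mirror my reproducer; substitute the ones you actually use):

```shell
# Rebuild PETSc at -O0 and link with the classic linker (workaround for
# the suspected Apple clang 15 compiler/linker issue), then rerun checks:
./configure --download-mpich --download-mumps --download-scalapack \
    COPTFLAGS=-O0 FOPTFLAGS=-O0 LDFLAGS=-Wl,-ld_classic \
  && make && make check
```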

Satish

On Fri, 29 Mar 2024, zeyu xia wrote:

> Hi! I am grateful for your prompt response.
> 
> I followed your suggestions; however, it still does not work. For the
> related information please find the files 'make check2.txt' and
> 'configure.log' in the attachment.
> 
> If possible, please do me a favor again. Thanks for your patience.
> 
> Best wishes,
> Zeyu Xia
> 
> 
> Satish Balay  wrote on Fri, 29 Mar 2024 at 23:48:
> 
> > Could you:
> >
> > - reinstall brew after the xcode upgrade (not just update)
> > https://urldefense.us/v3/__https://petsc.org/main/install/install/*installing-on-macos__;Iw!!G_uCfscf7eWS!dGItos-D58VSJn4kOlKy2TEX-PWhflbWfNuM0zqhEXbGniD5S13iWCxgBmg9wYk4OrSwaP6jjzANIHN1ZHATKXE$
> >  
> > - not use --LDFLAGS=-Wl,-ld_classic
> >
> > And see if the problem persists?
> >
> > Satish
> >
> > On Fri, 29 Mar 2024, zeyu xia wrote:
> >
> > > Dear PETSc team:
> > >
> > > Recently I installed firedrake on MacOS (arm64) with the latest
> > > Xcode, and there seems to be some error with MUMPS. I ran the command
> > > `make check` twice. The first time it just output wrong results, and the
> > > second time it raised an error with Segmentation Violation. Please see the
> > > files “make check.txt” and “configure.log” in the attachment.
> > >
> > > I will certainly be happy and grateful if you can take some time to
> > > deal with this problem. Thanks for your patience.
> > >
> > > Best wishes,
> > > Zeyu Xia
> > >
> >
> 

Re: [petsc-users] [External] Re: Does ILU(15) still make sense or should just use LU?

2024-03-29 Thread Satish Balay via petsc-users
On Fri, 29 Mar 2024, Pfeiffer, Sharon wrote:

> I’d like to unsubscribe to this mailing list.

Done.

Note: every list e-mail provides this info [in headers]

List-Id: PETSc users list 
List-Unsubscribe: 
,

List-Archive: 

List-Post: 
List-Help: 
List-Subscribe: 
,


Satish

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-03-29 Thread Satish Balay via petsc-users
Could you:

- reinstall brew after the xcode upgrade (not just update) 
https://urldefense.us/v3/__https://petsc.org/main/install/install/*installing-on-macos__;Iw!!G_uCfscf7eWS!fr-mLHdhIQgT2IBZhK9C2IQMUAmTmneTF38VsNLrywxooidf1uunovfx8qJrr8-Y73tICazCqyaZ6SJ6ca6JXnQ$
 
- not use --LDFLAGS=-Wl,-ld_classic

And see if the problem persists?
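A rough sketch of the reinstall step (package names are illustrative - reinstall whichever brew toolchain packages you actually use):

```shell
# Refresh the Xcode command-line tools after the Xcode upgrade, then
# rebuild brew-provided compilers against the new SDK:
xcode-select --install
brew update
brew reinstall gcc    # provides the gfortran that PETSc configure picks up
```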

Satish

On Fri, 29 Mar 2024, zeyu xia wrote:

> Dear PETSc team:
> 
> Recently I installed firedrake on MacOS (arm64) with the latest
> Xcode, and there seems to be some error with MUMPS. I ran the command
> `make check` twice. The first time it just output wrong results, and the
> second time it raised an error with Segmentation Violation. Please see the
> files “make check.txt” and “configure.log” in the attachment.
> 
> I will certainly be happy and grateful if you can take some time to
> deal with this problem. Thanks for your patience.
> 
> Best wishes,
> Zeyu Xia
> 


Re: [petsc-users] Using PetscPartitioner on WINDOWS

2024-03-21 Thread Satish Balay via petsc-users
Checking for program /usr/bin/git...not found
Checking for program 
/cygdrive/c/Users/Akun/AppData/Local/Programs/Git/bin/git...found

Also, you appear to not be using 'cygwin git'. I'm not sure if this alternative git 
would work or break the build - so either install/use cygwin git - or use tarballs.
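A quick sketch for checking which git configure will pick up (the setup.exe invocation is illustrative):

```shell
# Cygwin's git lives in /usr/bin; anything under /cygdrive/c/... is a
# native Windows git that configure may not handle correctly:
which git
# To install Cygwin's git non-interactively, e.g.:
#   setup-x86_64.exe -q -P git
# then make sure /usr/bin precedes the Windows Git directory in PATH.
```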

Satish

On Thu, 21 Mar 2024, Satish Balay via petsc-users wrote:

> Delete your old build files - and retry. i.e
> 
> rm -rf /cygdrive/g/mypetsc/petsc-3.20.5/arch-mswin-c-opt
> 
> ./configure 
> 
> Satish
> 
> 
> On Thu, 21 Mar 2024, 程奔 wrote:
> 
> > 
> > Hi, Satish
> > Thanks for your reply, I tried both ways you suggested in petsc-3.20.5 
> > but it encounters the same problem,
> > 
> > *
> > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> > details):
> >  
> > -
> >   Error running make on  METIS
> >  
> > *
> > 
> > I send configure.log with "--download-parmetis --download-metis" to you and 
> > ask for your help.
> > 
> > sincerely,
> > Ben.
> > 
> > > -Original Message-
> > > From: "Satish Balay" 
> > > Sent: 2024-03-20 21:29:56 (Wednesday)
> > > To: 程奔 
> > > Cc: petsc-users 
> > > Subject: Re: [petsc-users] Using PetscPartitioner on WINDOWS
> > > 
> > > >>>>
> > > Configure Options: --configModules=PETSc.Configure 
> > > --optionsModule=config.compilerOptions --with-debugging=0 
> > > --with-cc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_cl 
> > > --with-fc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_ifort
> > >  --with-cxx=/cygdrive/g/mypetsc/p
> > etsc-3.20.2/lib/petsc/bin/win32fe/win_cl 
> > --with-blaslapack-lib=-L/cygdrive/g/Intel/oneAPI/mkl/2023.2.0/lib/intel64 
> > mkl-intel-lp64-dll.lib mkl-sequential-dll.lib mkl-core-dll.lib 
> > --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> > --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/20
> > 21.10.0/lib/release/impi.lib 
> > --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> > -localonly 
> > --download-parmetis=/cygdrive/g/mypetsc/petsc-pkg-parmetis-475d8facbb32.tar.gz
> >  --download-metis=/cygdrive/g/mypetsc/petsc-pkg-metis-ca7a59e6283f.tar.gz 
> > --with-strict-petscerrorcode=0
> > > <<<
> > > 
> > > >>>>>>>>
> > > Warning: win32fe: File Not Found: /Ox
> > > Error: win32fe: Input File Not Found: 
> > > G:\mypetsc\PETSC-~2.2\ARCH-M~1\EXTERN~1\PETSC-~1\PETSC-~1\libmetis\/Ox
> > > >>>>>>>>>>
> > > 
> > > Looks like you are using an old snapshot of metis. Can you remove your 
> > > local tarballs - and let [cygwin] git download the appropriate latest 
> > > version?
> > > 
> > > Or download and use: 
> > > https://urldefense.us/v3/__https://bitbucket.org/petsc/pkg-metis/get/8b194fdf09661ac41b36fa16db0474d38f46f1ac.tar.gz__;!!G_uCfscf7eWS!cI7AqtwOwG0MWFBbcOA813z8p7Q2IZcdv53HvzHMxK37qmlicGatMh0ya5WWcEVfiYZ5JDmS7vfgveYi05xU_O8moU6wWwAikw$
> > > Similarly for parmetis 
> > > https://urldefense.us/v3/__https://bitbucket.org/petsc/pkg-parmetis/get/f5e3aab04fd5fe6e09fa02f885c1c29d349f9f8b.tar.gz__;!!G_uCfscf7eWS!cI7AqtwOwG0MWFBbcOA813z8p7Q2IZcdv53HvzHMxK37qmlicGatMh0ya5WWcEVfiYZ5JDmS7vfgveYi05xU_O8moU4L6tLXtg$
> > > 
> > > Satish
> > > 
> > > On Wed, 20 Mar 2024, 程奔 wrote:
> > > 
> > > > Hi I try petsc-3. 20. 2 and petsc-3. 20. 5 with configure ./configure 
> > > > --with-debugging=0 --with-cc=cl --with-fc=ifort --with-cxx=cl 
> > > > --with-blaslapack-lib=-L/cygdrive/g/Intel/oneAPI/mkl/2023. 2. 
> > > > 0/lib/intel64 mkl-intel-lp64-dll. lib
> > > > mkl-sequential-dll. lib

Re: [petsc-users] Using PetscPartitioner on WINDOWS

2024-03-21 Thread Satish Balay via petsc-users
Delete your old build files - and retry. i.e

rm -rf /cygdrive/g/mypetsc/petsc-3.20.5/arch-mswin-c-opt

./configure 

Satish


On Thu, 21 Mar 2024, 程奔 wrote:

> 
> Hi, Satish
> Thanks for your reply, I tried both ways you suggested in petsc-3.20.5 
> but it encounters the same problem,
> 
> *
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> details):
>  
> -
>   Error running make on  METIS
>  
> *
> 
> I send configure.log with "--download-parmetis --download-metis" to you and 
> ask for your help.
> 
> sincerely,
> Ben.
> 
> > -Original Message-
> > From: "Satish Balay" 
> > Sent: 2024-03-20 21:29:56 (Wednesday)
> > To: 程奔 
> > Cc: petsc-users 
> > Subject: Re: [petsc-users] Using PetscPartitioner on WINDOWS
> > 
> > 
> > Configure Options: --configModules=PETSc.Configure 
> > --optionsModule=config.compilerOptions --with-debugging=0 
> > --with-cc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_cl 
> > --with-fc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_ifort 
> > --with-cxx=/cygdrive/g/mypetsc/p
> etsc-3.20.2/lib/petsc/bin/win32fe/win_cl 
> --with-blaslapack-lib=-L/cygdrive/g/Intel/oneAPI/mkl/2023.2.0/lib/intel64 
> mkl-intel-lp64-dll.lib mkl-sequential-dll.lib mkl-core-dll.lib 
> --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/20
> 21.10.0/lib/release/impi.lib 
> --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec -localonly 
> --download-parmetis=/cygdrive/g/mypetsc/petsc-pkg-parmetis-475d8facbb32.tar.gz
>  --download-metis=/cygdrive/g/mypetsc/petsc-pkg-metis-ca7a59e6283f.tar.gz 
> --with-strict-petscerrorcode=0
> > <<<
> > 
> > 
> > Warning: win32fe: File Not Found: /Ox
> > Error: win32fe: Input File Not Found: 
> > G:\mypetsc\PETSC-~2.2\ARCH-M~1\EXTERN~1\PETSC-~1\PETSC-~1\libmetis\/Ox
> > >>
> > 
> > Looks like you are using an old snapshot of metis. Can you remove your 
> > local tarballs - and let [cygwin] git download the appropriate latest 
> > version?
> > 
> > Or download and use: 
> > https://urldefense.us/v3/__https://bitbucket.org/petsc/pkg-metis/get/8b194fdf09661ac41b36fa16db0474d38f46f1ac.tar.gz__;!!G_uCfscf7eWS!cI7AqtwOwG0MWFBbcOA813z8p7Q2IZcdv53HvzHMxK37qmlicGatMh0ya5WWcEVfiYZ5JDmS7vfgveYi05xU_O8moU6wWwAikw$
> > Similarly for parmetis 
> > https://urldefense.us/v3/__https://bitbucket.org/petsc/pkg-parmetis/get/f5e3aab04fd5fe6e09fa02f885c1c29d349f9f8b.tar.gz__;!!G_uCfscf7eWS!cI7AqtwOwG0MWFBbcOA813z8p7Q2IZcdv53HvzHMxK37qmlicGatMh0ya5WWcEVfiYZ5JDmS7vfgveYi05xU_O8moU4L6tLXtg$
> > 
> > Satish
> > 
> > On Wed, 20 Mar 2024, 程奔 wrote:
> > 
> > > 
> > > Hi 
> > > I try petsc-3.20.2 and petsc-3.20.5 with configure 
> > > 
> > > ./configure  --with-debugging=0  --with-cc=cl --with-fc=ifort 
> > > --with-cxx=cl  
> > > --with-blaslapack-lib=-L/cygdrive/g/Intel/oneAPI/mkl/2023.2.0/lib/intel64 
> > > mkl-intel-lp64-dll.lib mkl-sequential-dll.lib mkl-core-dll.lib 
> > > --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> > > --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib
> > >  
> > > --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> > > -localonly 
> > > --download-parmetis=/cygdrive/g/mypetsc/petsc-pkg-parmetis-475d8facbb32.tar.gz
> > >  
> > > --download-metis=/cygdrive/g/mypetsc/petsc-pkg-metis-ca7a59e6283f.tar.gz 
> > > --with-strict-petscerrorcode=0
> > > 
> > > but it encounters the same problem,
> > > 
> > > *
> > >UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> > > details):
> > > -
> > >  Error running make on  

Re: [petsc-users] Using PetscPartitioner on WINDOWS

2024-03-20 Thread Satish Balay via petsc-users

Configure Options: --configModules=PETSc.Configure 
--optionsModule=config.compilerOptions --with-debugging=0 
--with-cc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_cl 
--with-fc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_ifort 
--with-cxx=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_cl 
--with-blaslapack-lib=-L/cygdrive/g/Intel/oneAPI/mkl/2023.2.0/lib/intel64 
mkl-intel-lp64-dll.lib mkl-sequential-dll.lib mkl-core-dll.lib 
--with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
--with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib 
--with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec -localonly 
--download-parmetis=/cygdrive/g/mypetsc/petsc-pkg-parmetis-475d8facbb32.tar.gz 
--download-metis=/cygdrive/g/mypetsc/petsc-pkg-metis-ca7a59e6283f.tar.gz 
--with-strict-petscerrorcode=0
<<<


Warning: win32fe: File Not Found: /Ox
Error: win32fe: Input File Not Found: 
G:\mypetsc\PETSC-~2.2\ARCH-M~1\EXTERN~1\PETSC-~1\PETSC-~1\libmetis\/Ox
>>

Looks like you are using an old snapshot of metis. Can you remove your local 
tarballs - and let [cygwin] git download the appropriate latest version?

Or download and use: 
https://urldefense.us/v3/__https://bitbucket.org/petsc/pkg-metis/get/8b194fdf09661ac41b36fa16db0474d38f46f1ac.tar.gz__;!!G_uCfscf7eWS!dgDT-Y8-6OviQz8-EOWxbNfMKo9bSBj18tYI-sFf-iCfuyKZ10s6q3DBpWS1Ha51iWExhoSpsV69NQ8uEN5mq_A$
 
Similarly for parmetis 
https://urldefense.us/v3/__https://bitbucket.org/petsc/pkg-parmetis/get/f5e3aab04fd5fe6e09fa02f885c1c29d349f9f8b.tar.gz__;!!G_uCfscf7eWS!dgDT-Y8-6OviQz8-EOWxbNfMKo9bSBj18tYI-sFf-iCfuyKZ10s6q3DBpWS1Ha51iWExhoSpsV69NQ8uTmPyVKM$
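i.e., something along these lines (paths illustrative; the urldefense-wrapped links above decode to bitbucket.org/petsc/pkg-metis and pkg-parmetis snapshot tarballs):

```shell
# Fetch fresh metis/parmetis snapshots and point configure at them:
cd /cygdrive/g/mypetsc
curl -L -o metis.tar.gz \
  "https://bitbucket.org/petsc/pkg-metis/get/8b194fdf09661ac41b36fa16db0474d38f46f1ac.tar.gz"
curl -L -o parmetis.tar.gz \
  "https://bitbucket.org/petsc/pkg-parmetis/get/f5e3aab04fd5fe6e09fa02f885c1c29d349f9f8b.tar.gz"
./configure <other options> \
    --download-metis=/cygdrive/g/mypetsc/metis.tar.gz \
    --download-parmetis=/cygdrive/g/mypetsc/parmetis.tar.gz
```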
 

Satish

On Wed, 20 Mar 2024, 程奔 wrote:

> 
> Hi 
> I try petsc-3.20.2 and petsc-3.20.5 with configure 
> 
> ./configure  --with-debugging=0  --with-cc=cl --with-fc=ifort --with-cxx=cl  
> --with-blaslapack-lib=-L/cygdrive/g/Intel/oneAPI/mkl/2023.2.0/lib/intel64 
> mkl-intel-lp64-dll.lib mkl-sequential-dll.lib mkl-core-dll.lib 
> --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib 
> --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec -localonly 
> --download-parmetis=/cygdrive/g/mypetsc/petsc-pkg-parmetis-475d8facbb32.tar.gz
>  
> --download-metis=/cygdrive/g/mypetsc/petsc-pkg-metis-ca7a59e6283f.tar.gz 
> --with-strict-petscerrorcode=0
> 
> but it encounters the same problem,
> 
> *
>UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> details):
> -
>  Error running make on  METIS
> *
> 
> and I see 
> https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6412623047__;!!G_uCfscf7eWS!YOO7nEnwU4BJQXD3WkP3QCvaT1gfLxBxnrNdXp9SJbjETmw7uaRKaUkPRPEgWgxibROg8o_rr8SxbnaVWtJbAT-t281f2ha4Aw$
>  for a successful build of latest 
> petsc-3.20, it seems to have something called "sowing" and "bison", but I 
> don't have them.
> 
> So I ask for your help, and configure.log is attached.
> 
> sincerely,
> Ben.
> 
> > -Original Message-
> > From: "Satish Balay" 
> > Sent: 2024-03-20 00:48:57 (Wednesday)
> > To: "Barry Smith" 
> > Cc: 程奔 , PETSc 
> > Subject: Re: [petsc-users] Using PetscPartitioner on WINDOWS
> > 
> > Check 
> > https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6412623047__;!!G_uCfscf7eWS!YOO7nEnwU4BJQXD3WkP3QCvaT1gfLxBxnrNdXp9SJbjETmw7uaRKaUkPRPEgWgxibROg8o_rr8SxbnaVWtJbAT-t281f2ha4Aw$
> >  for a successful build of latest 
> petsc-3.20 [i.e release branch in git] with metis and parmetis
> > 
> > Note the usage:
> > 
> > >
> > '--with-cc=cl',
> > '--with-cxx=cl',
> > '--with-fc=ifort',
> > 
> > 
> > Satish
> > 
> > On Tue, 19 Mar 2024, Barry Smith wrote:
> > 
> > > 
> > >   Are you not able to use PETSc 3.20.2 ?
> > > 
> > 

Re: [petsc-users] Using PetscPartitioner on WINDOWS

2024-03-19 Thread Satish Balay via petsc-users
Check 
https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6412623047__;!!G_uCfscf7eWS!ZAg_b85bAvm8-TShDMHvxaXIu77pjwlDqU2g9AXQSNNw0gmk3peDktdf8MsGAq3jHLTJHo6WSPGyEe5QrCJ-fN0$
  for a successful build of latest petsc-3.20 [i.e release branch in git] with 
metis and parmetis

Note the usage:

>
'--with-cc=cl',
'--with-cxx=cl',
'--with-fc=ifort',


Satish

On Tue, 19 Mar 2024, Barry Smith wrote:

> 
>   Are you not able to use PETSc 3.20.2 ?
> 
>   On Mar 19, 2024, at 5:27 AM, 程奔  wrote:
> 
> Hi, Barry
> 
> I tried to use PETSc version 3.19.5 on Windows, but it encountered a problem.
> 
> 
>  
> *
>            UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> details):
> -
>                               Error configuring METIS with CMake
> *
> 
> configure.log is attached.
> 
> 
> Looking forward to your reply!
> 
> sincerely,
> 
> Ben.
> 
> 
> 
>   -Original Message-
>   From: "Barry Smith" 
>   Sent: 2024-03-18 21:11:14 (Monday)
>   To: 程奔 <202321009...@mail.scut.edu.cn>
>   Cc: petsc-users@mcs.anl.gov
>   Subject: Re: [petsc-users] Using PetscPartitioner on WINDOWS
> 
> 
> Please switch to the latest PETSc version, it supports Metis and Parmetis on 
> Windows.
>   Barry
> 
> 
>   On Mar 17, 2024, at 11:57 PM, 程奔 <202321009...@mail.scut.edu.cn> wrote:
> 
> 
> Hello,
> 
> Recently I tried to install PETSc with Cygwin since I'd like to use PETSc with 
> Visual Studio on the Windows 10 platform. For clarity, I first list 
> the software/packages used below:
> 1. PETSc: version 3.16.5
> 2. VS: version 2022 
> 3. Intel MPI: download Intel oneAPI Base Toolkit and HPC Toolkit
> 4. Cygwin
> 
> 
> On windows,
> I then try to calculate a simple cantilever beam that uses a tetrahedral mesh, 
> so it's an unstructured grid.
> I use DMPlexCreateFromFile() to create the dmplex.
> 
> And then I want to distribute the mesh using the PETSCPARTITIONERPARMETIS 
> type (in my opinion this PetscPartitioner type may be the best for dmplex;
> 
> see fig 1 for my comparison of different PetscPartitioner types on a 
> cantilever beam on a Linux system.)
> 
> But unfortunately, when I try to use parmetis on windows, I configure PETSc 
> as follows
> 
> 
>  ./configure  --with-debugging=0  --with-cc='win32fe cl' --with-fc='win32fe 
> ifort' --with-cxx='win32fe cl'  
> 
> --download-fblaslapack=/cygdrive/g/mypetsc/petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
>   --with-shared-libraries=0 
> 
> --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include
>  --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib 
> --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> --download-parmetis=/cygdrive/g/mypetsc/petsc-pkg-parmetis-475d8facbb32.tar.gz
>  
> --download-metis=/cygdrive/g/mypetsc/petsc-pkg-metis-ca7a59e6283f.tar.gz 
> 
> 
> 
> 
> it shows that 
> ***
> External package metis does not support --download-metis with Microsoft 
> compilers
> ***
> configure.log and make.log is attached
> 
> 
> 
> If I use the PetscPartitioner Simple type, the computation time is much higher 
> than with the PETSCPARTITIONERPARMETIS type.
> 
> So on Windows I want to use a PetscPartitioner like parmetis - whether there 
> is any other PetscPartitioner type that can do the same work as parmetis, 
> 
> or I could just try to download parmetis separately on windows (like this website, 
> https://urldefense.us/v3/__https://boogie.inm.ras.ru/terekhov/INMOST/-/wikis/0204-Compilation-ParMETIS-Windows__;!!G_uCfscf7eWS!ZAg_b85bAvm8-TShDMHvxaXIu77pjwlDqU2g9AXQSNNw0gmk3peDktdf8MsGAq3jHLTJHo6WSPGyEe5Qgw3sA7A$
>  ) 
> 
> and then use its library from Visual Studio; I don't know whether PETSc 
> could use it successfully this way.
> 
> 
> So I write this email to report my problem and ask for your help.
> 
> Looking forward to your reply!
> 
> 
> sincerely,
> Ben.
> 
> 
> 
> 
> 
> 


Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-18 Thread Satish Balay via petsc-users
On Mon, 18 Mar 2024, Pierre Jolivet wrote:

> 
> And here we go: 
> https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6420606887__;!!G_uCfscf7eWS!alfBlmyFQ5JJUYKxxFdETav6xjHOl5W54BPrmJEyXdSakVXnj8eYIRZdknOI-FK4uiaPdL4zSdJlD2zrcw$
>  
> 20 minutes in, and still in the dm_* tests with timeouts right, left, and 
> center.
> For reference, this prior job 
> https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6418468279__;!!G_uCfscf7eWS!alfBlmyFQ5JJUYKxxFdETav6xjHOl5W54BPrmJEyXdSakVXnj8eYIRZdknOI-FK4uiaPdL4zSdJj83LENQ$
>   completed in 3 minutes (OK, maybe add a couple of minutes to rebuild the 
> packages to have a fair comparison).
> What did they do to OpenBLAS? Add a sleep() in their axpy?

(gdb) r
Starting program: /home/petsc/petsc/src/dm/dt/tests/ex13 
^C
Program received signal SIGINT, Interrupt.
0xf331ad10 in dgemm_otcopy (m=m@entry=8, n=n@entry=7, 
a=a@entry=0x58f150, lda=lda@entry=15, b=b@entry=0xefae) at 
../kernel/arm64/../generic/gemm_tcopy_2.c:69
69      *(b_offset1 + 3) = *(a_offset2 + 1);
(gdb) where
#0  0xf331ad10 in dgemm_otcopy (m=m@entry=8, n=n@entry=7, 
a=a@entry=0x58f150, lda=lda@entry=15, b=b@entry=0xefae) at 
../kernel/arm64/../generic/gemm_tcopy_2.c:69
#1  0xf3342e68 in dgetrf_single (args=args@entry=0xe9d8, 
range_m=range_m@entry=0x0, range_n=range_n@entry=0x0, 
sa=sa@entry=0xefae, sb=, myid=myid@entry=0) at 
getrf_single.c:157
#2  0xf3255ec4 in dgetrf_ (M=, N=, 
a=, ldA=, ipiv=, 
Info=0xeaa8) at lapack/getrf.c:110
#3  0xf50b8dd8 in MatLUFactor_SeqDense (A=0x598360, row=0x0, col=0x0, 
minfo=0xeba8) at /home/petsc/petsc/src/mat/impls/dense/seq/dense.c:801
#4  0xf559b8b4 in MatLUFactor (mat=0x598360, row=0x0, col=0x0, 
info=0xeba8) at /home/petsc/petsc/src/mat/interface/matrix.c:3087
#5  0x004149e0 in test (dim=2, deg=3, form=-1, jetDegree=3, 
cond=PETSC_FALSE) at ex13.c:141
#6  0x00418f20 in main (argc=1, argv=0xf158) at ex13.c:303
(gdb) 

It appears to get stuck in a loop here.

This test runs fine if I remove the 
"--download-openblas-make-options=TARGET=GENERIC" option.

Ok - trying out "git bisect"
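Roughly, the bisect session looks like this (a sketch - the endpoints and the rebuild/test step are illustrative):

```shell
git clone https://github.com/OpenMathLib/OpenBLAS.git && cd OpenBLAS
git bisect start
git bisect bad HEAD          # current snapshot: ex13 hangs
git bisect good v0.3.21      # an older release that worked
# git now checks out a midpoint commit; rebuild with the suspect option,
# rebuild PETSc against it, and run the hanging test (src/dm/dt/tests/ex13);
# then mark the outcome and repeat:
git bisect good              # or: git bisect bad
# until git reports "<sha> is the first bad commit"; finally:
git bisect reset
```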

ea6c5f3cf553a23f8e2e787307805e7874e1f9c6 is the first bad commit
commit ea6c5f3cf553a23f8e2e787307805e7874e1f9c6
Author: Martin Kroeker 
Date:   Sun Oct 30 12:55:23 2022 +0100

Add option RELAPACK_REPLACE

 Makefile.rule   | 5 -
 Makefile.system | 4 
 2 files changed, 8 insertions(+), 1 deletion(-)

Don't really understand why this change is triggering this hang. Or the correct 
way to build latest openblas [do we need "BUILD_RELAPACK=1"?]

Satish


Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-18 Thread Satish Balay via petsc-users
On Mon, 18 Mar 2024, Satish Balay via petsc-users wrote:

> On Mon, 18 Mar 2024, Pierre Jolivet wrote:
> 
> > 
> > 
> > > On 18 Mar 2024, at 5:13 PM, Satish Balay via petsc-users 
> > >  wrote:
> > > 
> > > Ah - the compiler did flag code bugs.
> > > 
> > >> (current version is 0.3.26 but we can’t update because there is a huge 
> > >> performance regression which makes the pipeline timeout)
> > > 
> > > maybe we should retry - updating to the latest snapshot and see if this 
> > > issue persists.
> > 
> > Well, that’s easy to see it is _still_ broken: 
> > https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6419779589__;!!G_uCfscf7eWS!f4svx7Rv1mmcLfy5l0C9bXXrw9gwb49ykkTb28IAtZW0VgZ8vgdD8exUOZSL0TCEqqP5X-p-0ll6TetPkw$
> >  
> > The infamous gcc segfault that can’t let us run the pipeline, but that 
> > builds fine when it’s you that connect to the machine (I bothered you about 
> > this a couple of months ago in case you don’t remember, see 
> > https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/merge_requests/7143__;!!G_uCfscf7eWS!f4svx7Rv1mmcLfy5l0C9bXXrw9gwb49ykkTb28IAtZW0VgZ8vgdD8exUOZSL0TCEqqP5X-p-0llrLiE4GQ$
> >  ).
> 
> > make[2]: *** [../../Makefile.tail:46: libs] Bus error (core dumped)
> 
> Ah - ok - that's a strange error. I'm not sure how to debug it. [it fails 
> when the build is invoked from configure - but not when its invoked directly 
> from bash/shell.]

Pushed a potential workaround to jolivet/test-openblas

Note: The failure comes up on the same OS (Fedora 39) on x64 as well.

Satish

> 
> Satish
> 
> > 
> > Thanks,
> > Pierre
> > 
> > > 
> > > Satish
> > > 
> > > On Mon, 18 Mar 2024, Zongze Yang wrote:
> > > 
> > >> The issue of openblas was resolved by this pr 
> > >> https://urldefense.us/v3/__https://github.com/OpenMathLib/OpenBLAS/pull/4565__;!!G_uCfscf7eWS!b09n5clcTFuLceLY_9KfqtSsgmmCIBLFbqciRVCKvnvFw9zTaNF8ssK0MiQlBOXUJe7H88nl-7ExdfhB-cMXLQ2d$
> > >>  
> > >> 
> > >> Best wishes,
> > >> Zongze
> > >> 
> > >>> On 18 Mar 2024, at 00:50, Zongze Yang  wrote:
> > >>> 
> > >>> It can be resolved by adding CFLAGS=-Wno-int-conversion. Perhaps the 
> > >>> default behaviour of the new version compiler has been changed?
> > >>> 
> > >>> Best wishes,
> > >>> Zongze
> > >>>> On 18 Mar 2024, at 00:23, Satish Balay  wrote:
> > >>>> 
> > >>>> Hm - I just tried a build with balay/xcode15-mpich - and that goes 
> > >>>> through fine for me. So don't know what the difference here is.
> > >>>> 
> > >>>> One difference is - I have a slightly older xcode. However your 
> > >>>> compiler appears to behave as using -Werror. Perhaps 
> > >>>> CFLAGS=-Wno-int-conversion will help here?
> > >>>> 
> > >>>> Satish
> > >>>> 
> > >>>> 
> > >>>> Executing: gcc --version
> > >>>> stdout:
> > >>>> Apple clang version 15.0.0 (clang-1500.3.9.4)
> > >>>> 
> > >>>> Executing: 
> > >>>> /Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/bin/mpicc -show
> > >>>> stdout: gcc -fPIC -fno-stack-check -Qunused-arguments -g -O0 
> > >>>> -Wno-implicit-function-declaration -fno-common 
> > >>>> -I/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/include 
> > >>>> -L/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/lib -lmpi 
> > >>>> -lpmpi
> > >>>> 
> > >>>> /Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/bin/mpicc -O2 
> > >>>> -DMAX_STACK_ALLOC=2048 -Wall -DF_INTERFACE_GFORT -fPIC -DNO_WARMUP 
> > >>>> -DMAX_CPU_NUMBER=12 -DMAX_PARALLEL_NUMBER=1 -DBUILD_SINGLE=1 
> > >>>> -DBUILD_DOUBLE=1 -DBUILD_COMPLEX=1 -DBUILD_COMPLEX16=1 
> > >>>> -DVERSION=\"0.3.21\" -march=armv8-a -UASMNAME -UASMFNAME -UNAME 
> > >>>> -UCNAME -UCHAR_NAME -UCHAR_CNAME -DASMNAME=_lapack_wrappers 
> > >>>> -DASMFNAME=_lapack_wrappers_ -DNAME=lapack_wrappers_ 
> > >>>> -DCNAME=lapack_wrappers -DCHAR_NAME=\"lapack_wrappers_\" 
> > >>>> -DCHAR_CNAME=\"lapack_wrappers\" -DNO_AFFINITY -I.. -c 
> > >>>> src/lapack_wrappers.c -o src

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-18 Thread Satish Balay via petsc-users
On Mon, 18 Mar 2024, Pierre Jolivet wrote:

> 
> 
> > On 18 Mar 2024, at 5:13 PM, Satish Balay via petsc-users 
> >  wrote:
> > 
> > Ah - the compiler did flag code bugs.
> > 
> >> (current version is 0.3.26 but we can’t update because there is a huge 
> >> performance regression which makes the pipeline timeout)
> > 
> > maybe we should retry - updating to the latest snapshot and see if this 
> > issue persists.
> 
> Well, that’s easy to see it is _still_ broken: 
> https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/jobs/6419779589__;!!G_uCfscf7eWS!f4svx7Rv1mmcLfy5l0C9bXXrw9gwb49ykkTb28IAtZW0VgZ8vgdD8exUOZSL0TCEqqP5X-p-0ll6TetPkw$
>  
> The infamous gcc segfault that can’t let us run the pipeline, but that builds 
> fine when it’s you that connect to the machine (I bothered you about this a 
> couple of months ago in case you don’t remember, see 
> https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/merge_requests/7143__;!!G_uCfscf7eWS!f4svx7Rv1mmcLfy5l0C9bXXrw9gwb49ykkTb28IAtZW0VgZ8vgdD8exUOZSL0TCEqqP5X-p-0llrLiE4GQ$
>  ).

> make[2]: *** [../../Makefile.tail:46: libs] Bus error (core dumped)

Ah - ok - that's a strange error. I'm not sure how to debug it. [it fails when 
the build is invoked from configure - but not when its invoked directly from 
bash/shell.]

Satish

> 
> Thanks,
> Pierre
> 
> > 
> > Satish
> > 
> > On Mon, 18 Mar 2024, Zongze Yang wrote:
> > 
> >> The issue of openblas was resolved by this pr 
> >> https://urldefense.us/v3/__https://github.com/OpenMathLib/OpenBLAS/pull/4565__;!!G_uCfscf7eWS!b09n5clcTFuLceLY_9KfqtSsgmmCIBLFbqciRVCKvnvFw9zTaNF8ssK0MiQlBOXUJe7H88nl-7ExdfhB-cMXLQ2d$
> >>  
> >> 
> >> Best wishes,
> >> Zongze
> >> 
> >>> On 18 Mar 2024, at 00:50, Zongze Yang  wrote:
> >>> 
> >>> It can be resolved by adding CFLAGS=-Wno-int-conversion. Perhaps the 
> >>> default behaviour of the new version compiler has been changed?
> >>> 
> >>> Best wishes,
> >>> Zongze
> >>>> On 18 Mar 2024, at 00:23, Satish Balay  wrote:
> >>>> 
> >>>> Hm - I just tried a build with balay/xcode15-mpich - and that goes 
> >>>> through fine for me. So don't know what the difference here is.
> >>>> 
> >>>> One difference is - I have a slightly older xcode. However, your compiler 
> >>>> appears to behave as if -Werror were in use. Perhaps CFLAGS=-Wno-int-conversion 
> >>>> will help here?
> >>>> 
> >>>> Satish
> >>>> 
> >>>> 
> >>>> Executing: gcc --version
> >>>> stdout:
> >>>> Apple clang version 15.0.0 (clang-1500.3.9.4)
> >>>> 
> >>>> Executing: 
> >>>> /Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/bin/mpicc -show
> >>>> stdout: gcc -fPIC -fno-stack-check -Qunused-arguments -g -O0 
> >>>> -Wno-implicit-function-declaration -fno-common 
> >>>> -I/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/include 
> >>>> -L/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/lib -lmpi 
> >>>> -lpmpi
> >>>> 
> >>>> /Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/bin/mpicc -O2 
> >>>> -DMAX_STACK_ALLOC=2048 -Wall -DF_INTERFACE_GFORT -fPIC -DNO_WARMUP 
> >>>> -DMAX_CPU_NUMBER=12 -DMAX_PARALLEL_NUMBER=1 -DBUILD_SINGLE=1 
> >>>> -DBUILD_DOUBLE=1 -DBUILD_COMPLEX=1 -DBUILD_COMPLEX16=1 
> >>>> -DVERSION=\"0.3.21\" -march=armv8-a -UASMNAME -UASMFNAME -UNAME -UCNAME 
> >>>> -UCHAR_NAME -UCHAR_CNAME -DASMNAME=_lapack_wrappers 
> >>>> -DASMFNAME=_lapack_wrappers_ -DNAME=lapack_wrappers_ 
> >>>> -DCNAME=lapack_wrappers -DCHAR_NAME=\"lapack_wrappers_\" 
> >>>> -DCHAR_CNAME=\"lapack_wrappers\" -DNO_AFFINITY -I.. -c 
> >>>> src/lapack_wrappers.c -o src/lapack_wrappers.o
> >>>> src/lapack_wrappers.c:570:81: error: incompatible integer to pointer 
> >>>> conversion passing 'blasint' (aka 'int') to parameter of type 'const 
> >>>> blasint *' (aka 'const int *'); take the address with & 
> >>>> [-Wint-conversion]
> >>>>   RELAPACK_sgemmt(uplo, transA, transB, n, k, alpha, A, ldA, B, ldB, 
> >>>> beta, C, info);
> >>>>  
> >>>>  ^~~~
> >>>>  

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-18 Thread Satish Balay via petsc-users
Ah - the compiler did flag code bugs.

> (current version is 0.3.26 but we can’t update because there is a huge 
> performance regression which makes the pipeline timeout)

maybe we should retry - updating to the latest snapshot and see if this issue 
persists.

Satish

On Mon, 18 Mar 2024, Zongze Yang wrote:

> The issue of openblas was resolved by this pr 
> https://urldefense.us/v3/__https://github.com/OpenMathLib/OpenBLAS/pull/4565__;!!G_uCfscf7eWS!b09n5clcTFuLceLY_9KfqtSsgmmCIBLFbqciRVCKvnvFw9zTaNF8ssK0MiQlBOXUJe7H88nl-7ExdfhB-cMXLQ2d$
>  
> 
> Best wishes,
> Zongze
> 
> > On 18 Mar 2024, at 00:50, Zongze Yang  wrote:
> > 
> > It can be resolved by adding CFLAGS=-Wno-int-conversion. Perhaps the 
> > default behaviour of the new version compiler has been changed?
> > 
> > Best wishes,
> > Zongze
> >> On 18 Mar 2024, at 00:23, Satish Balay  wrote:
> >> 
> >> Hm - I just tried a build with balay/xcode15-mpich - and that goes through 
> >> fine for me. So don't know what the difference here is.
> >> 
> >> One difference is - I have a slightly older xcode. However, your compiler 
> >> appears to behave as if -Werror were in use. Perhaps CFLAGS=-Wno-int-conversion 
> >> will help here?
> >> 
> >> Satish
> >> 
> >> 
> >> Executing: gcc --version
> >> stdout:
> >> Apple clang version 15.0.0 (clang-1500.3.9.4)
> >> 
> >> Executing: 
> >> /Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/bin/mpicc -show
> >> stdout: gcc -fPIC -fno-stack-check -Qunused-arguments -g -O0 
> >> -Wno-implicit-function-declaration -fno-common 
> >> -I/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/include 
> >> -L/Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/lib -lmpi -lpmpi
> >> 
> >> /Users/zzyang/workspace/repos/petsc/arch-darwin-c-debug/bin/mpicc -O2 
> >> -DMAX_STACK_ALLOC=2048 -Wall -DF_INTERFACE_GFORT -fPIC -DNO_WARMUP 
> >> -DMAX_CPU_NUMBER=12 -DMAX_PARALLEL_NUMBER=1 -DBUILD_SINGLE=1 
> >> -DBUILD_DOUBLE=1 -DBUILD_COMPLEX=1 -DBUILD_COMPLEX16=1 
> >> -DVERSION=\"0.3.21\" -march=armv8-a -UASMNAME -UASMFNAME -UNAME -UCNAME 
> >> -UCHAR_NAME -UCHAR_CNAME -DASMNAME=_lapack_wrappers 
> >> -DASMFNAME=_lapack_wrappers_ -DNAME=lapack_wrappers_ 
> >> -DCNAME=lapack_wrappers -DCHAR_NAME=\"lapack_wrappers_\" 
> >> -DCHAR_CNAME=\"lapack_wrappers\" -DNO_AFFINITY -I.. -c 
> >> src/lapack_wrappers.c -o src/lapack_wrappers.o
> >> src/lapack_wrappers.c:570:81: error: incompatible integer to pointer 
> >> conversion passing 'blasint' (aka 'int') to parameter of type 'const 
> >> blasint *' (aka 'const int *'); take the address with & [-Wint-conversion]
> >>RELAPACK_sgemmt(uplo, transA, transB, n, k, alpha, A, ldA, B, ldB, 
> >> beta, C, info);
> >>
> >> ^~~~
> >>
> >> &
> >> 
> >> vs:
> >> Executing: gcc --version
> >> stdout:
> >> Apple clang version 15.0.0 (clang-1500.1.0.2.5)
> >> 
> >> Executing: /Users/balay/petsc/arch-darwin-c-debug/bin/mpicc -show
> >> stdout: gcc -fPIC -fno-stack-check -Qunused-arguments -g -O0 
> >> -Wno-implicit-function-declaration -fno-common 
> >> -I/Users/balay/petsc/arch-darwin-c-debug/include 
> >> -L/Users/balay/petsc/arch-darwin-c-debug/lib -lmpi -lpmpi
> >> 
> >> 
> >> /Users/balay/petsc/arch-darwin-c-debug/bin/mpicc -O2 
> >> -DMAX_STACK_ALLOC=2048 -Wall -DF_INTERFACE_GFORT -fPIC -DNO_WARMUP 
> >> -DMAX_CPU_NUMBER=24 -DMAX_PARALLEL_NUMBER=1 -DBUILD_SINGLE=1 
> >> -DBUILD_DOUBLE=1 -DBUILD_COMPLEX=1 -DBUILD_COMPLEX16=1 
> >> -DVERSION=\"0.3.21\" -march=armv8-a -UASMNAME -UASMFNAME -UNAME -UCNAME 
> >> -UCHAR_NAME -UCHAR_CNAME -DASMNAME=_lapack_wrappers 
> >> -DASMFNAME=_lapack_wrappers_ -DNAME=lapack_wrappers_ 
> >> -DCNAME=lapack_wrappers -DCHAR_NAME=\"lapack_wrappers_\" 
> >> -DCHAR_CNAME=\"lapack_wrappers\" -DNO_AFFINITY -I.. -c 
> >> src/lapack_wrappers.c -o src/lapack_wrappers.o
> >> src/lapack_wrappers.c:570:81: warning: incompatible integer to pointer 
> >> conversion passing 'blasint' (aka 'int') to parameter of type 'const 
> >> blasint *' (aka 'const int *'); take the address with & [-Wint-conversion]
> >>RELAPACK_sgemmt(uplo, transA, transB, n, k, alpha, A, ldA, B, ldB, 
> >> beta, C, info);
> >>
> >> ^~~~
> >>
> >> &
> >> 
> >> 
> >> 
> >> 
> >> On Sun, 17 Mar 2024, Pierre Jolivet wrote:
> >> 
> >>> Ah, my bad, I misread linux-opt-arm as a macOS runner, no wonder the 
> >>> option is not helping…
> >>> Take Barry’s advice.
> >>> Furthermore, it looks like OpenBLAS people are steering in the opposite 
> >>> direction as us, by forcing the use of ld-classic 
> >>> 

Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on Windows by using Intel MPI

2024-03-12 Thread Satish Balay via petsc-users
Glad you have a successful build! Thanks for the update.

Satish

On Tue, 12 Mar 2024, 程奔 wrote:

> 
> Hi Satish
>   Sorry for replying to your email so late, I followed your suggestion and it 
> has been installed successfully.
>   Thank you so much.
> 
> best
> wishes,
> Ben
> 
> 
> > -----Original Message-----
> > From: "Satish Balay" 
> > Sent: 2024-03-06 18:21:45 (Wednesday)
> > To: 程奔 
> > Cc: petsc-users@mcs.anl.gov
> > Subject: Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on 
> > Windows by using Intel MPI
> > 
> > > make[3]: *** No rule to make target 'w'.  Stop.
> > 
> > Try the following to overcome the above error:
> > 
> > make OMAKE_PRINTDIR=make all
> > 
> > However 3.13.6 is a bit old - so don't know if it will work with these 
> > versions of compilers.
> > 
> > Satish
> > 
> > On Wed, 6 Mar 2024, 程奔 wrote:
> > 
> > > Hello,
> > > 
> > > 
> > > Last time I  installed PETSc 3.19.2 with Cygwin in Windows10 successfully.
> > > 
> > > Recently I tried to install PETSc 3.13.6 with Cygwin since I'd like to use 
> > > PETSc with Visual Studio on the Windows10 platform. For the sake of clarity, 
> > > I first list the software/packages used below:
> > > 
> > > 1. PETSc: version 3.13.6
> > > 2. VS: version 2022 
> > > 3. Intel MPI: download Intel oneAPI Base Toolkit and HPC Toolkit
> > > 
> > > 
> > > 4. Cygwin
> > > 
> > > 5. External package: petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
> > > 
> > > And the compiler option in configuration is:
> > > 
> > > ./configure  --with-debugging=0  --with-cc='win32fe cl' 
> > > --with-fc='win32fe ifort' --with-cxx='win32fe cl'  
> > > 
> > > --download-fblaslapack=/cygdrive/g/mypetsc/petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
> > >   --with-shared-libraries=0 
> > > 
> > > --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> > > --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib
> > >  
> > > 
> > > --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> > > 
> > > 
> > > 
> > > 
> > > Then I build PETSc libraries with:
> > > 
> > > make PETSC_DIR=/cygdrive/g/mypetsc/petsc-3.13.6 
> > > PETSC_ARCH=arch-mswin-c-opt all
> > > 
> > > but it returns an error:
> > > 
> > > **ERROR*
> > >   Error during compile, check arch-mswin-c-opt/lib/petsc/conf/make.log
> > >   Send it and arch-mswin-c-opt/lib/petsc/conf/configure.log to 
> > > petsc-ma...@mcs.anl.gov
> > > 
> > > So I write this email to report my problem and ask for your help.
> > > 
> > > 
> > > Looking forward to your reply!
> > > 
> > > 
> > > sincerely,
> > > Cheng.
> > > 
> > > 
> > > 
> > > 
> 
> 


Re: [petsc-users] Broken links in FAQ

2024-03-08 Thread Satish Balay via petsc-users
The website is now updated

Satish

On Fri, 8 Mar 2024, Satish Balay via petsc-users wrote:

> Thanks for the report! The fix is at 
> https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/merge_requests/7343__;!!G_uCfscf7eWS!aRgkooWLFmFbCqsaZyYQixeOgy0qD1N3WlxPMXIGCCA-fhjJ6DSuGhanT-xc5iuF4tjVn4BBShyMJnqZr2I0dEo$
>  
> 
> Satish
> 
> On Fri, 8 Mar 2024, David Bold wrote:
> 
> > 
> > Dear all,
> > 
> > I noticed that the links to TS and PetscSF are broken in the FAQ on the 
> > website [1].
> > 
> > Unfortunately I do not have a gitlab.com account handy, so I could not 
> > open a bug.
> > 
> > Best,
> > David
> > 
> > [1] 
> > https://urldefense.us/v3/__https://petsc.org/release/*doc-index-citing-petsc__;Iw!!G_uCfscf7eWS!cA8OaCw8cHcgyL6gQl2uDCphmPd-jX0gmF1qUhry6mBj_WxHWgDp5mQ5tEdwo7zb84CgpHPnXFxh6-BjQVF1DV
> > 6mN3Wj$
> > 
> > 
> 


Re: [petsc-users] Broken links in FAQ

2024-03-08 Thread Satish Balay via petsc-users
Thanks for the report! The fix is at 
https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/merge_requests/7343__;!!G_uCfscf7eWS!aRgkooWLFmFbCqsaZyYQixeOgy0qD1N3WlxPMXIGCCA-fhjJ6DSuGhanT-xc5iuF4tjVn4BBShyMJnqZr2I0dEo$
 

Satish

On Fri, 8 Mar 2024, David Bold wrote:

> 
> Dear all,
> 
> I noticed that the links to TS and PetscSF are broken in the FAQ on the 
> website [1].
> 
> Unfortunately I do not have a gitlab.com account handy, so I could not 
> open a bug.
> 
> Best,
> David
> 
> [1] 
> https://urldefense.us/v3/__https://petsc.org/release/*doc-index-citing-petsc__;Iw!!G_uCfscf7eWS!cA8OaCw8cHcgyL6gQl2uDCphmPd-jX0gmF1qUhry6mBj_WxHWgDp5mQ5tEdwo7zb84CgpHPnXFxh6-BjQVF1DV
> 6mN3Wj$
> 
> 


Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on Windows by using Intel MPI

2024-03-06 Thread Satish Balay via petsc-users
> make[3]: *** No rule to make target 'w'.  Stop.

Try the following to overcome the above error:

make OMAKE_PRINTDIR=make all

However 3.13.6 is a bit old - so don't know if it will work with these versions 
of compilers.

Satish

On Wed, 6 Mar 2024, 程奔 wrote:

> Hello,
> 
> 
> Last time I  installed PETSc 3.19.2 with Cygwin in Windows10 successfully.
> 
> Recently I tried to install PETSc 3.13.6 with Cygwin since I'd like to use 
> PETSc with Visual Studio on the Windows10 platform. For the sake of clarity, I 
> first list the software/packages used below:
> 
> 1. PETSc: version 3.13.6
> 2. VS: version 2022 
> 3. Intel MPI: download Intel oneAPI Base Toolkit and HPC Toolkit
> 
> 
> 4. Cygwin
> 
> 5. External package: petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
> 
> And the compiler option in configuration is:
> 
> ./configure  --with-debugging=0  --with-cc='win32fe cl' --with-fc='win32fe 
> ifort' --with-cxx='win32fe cl'  
> 
> --download-fblaslapack=/cygdrive/g/mypetsc/petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
>   --with-shared-libraries=0 
> 
> --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib 
> 
> --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> 
> 
> 
> 
> Then I build PETSc libraries with:
> 
> make PETSC_DIR=/cygdrive/g/mypetsc/petsc-3.13.6 PETSC_ARCH=arch-mswin-c-opt 
> all
> 
> but it returns an error:
> 
> **ERROR*
>   Error during compile, check arch-mswin-c-opt/lib/petsc/conf/make.log
>   Send it and arch-mswin-c-opt/lib/petsc/conf/configure.log to 
> petsc-ma...@mcs.anl.gov
> 
> So I write this email to report my problem and ask for your help.
> 
> 
> Looking forward to your reply!
> 
> 
> sincerely,
> Cheng.
> 


Re: [petsc-users] Cannot do make on ex55k

2024-02-14 Thread Satish Balay via petsc-users
Looks like ex55 is the one to use - that links in with ex55k

But it needs a fix for a build from 'make'

>>
balay@petsc-gpu-01:/scratch/balay/petsc/src/snes/tutorials$ git diff
diff --git a/src/snes/tutorials/makefile b/src/snes/tutorials/makefile
index 672a62aa5a0..eed127f7eae 100644
--- a/src/snes/tutorials/makefile
+++ b/src/snes/tutorials/makefile
@@ -8,6 +8,8 @@ CLEANFILES   = ex5f90t
 include ${PETSC_DIR}/lib/petsc/conf/variables
 include ${PETSC_DIR}/lib/petsc/conf/rules
 
+ex55: ex55.o ex55k.o
+
 #-
 
 #  these tests are used by the makefile in PETSC_DIR for basic tests of the 
install and should not be removed
balay@petsc-gpu-01:/scratch/balay/petsc/src/snes/tutorials$ make ex55
mpicc -o ex55.o -c -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas 
-Wno-lto-type-mismatch -Wno-stringop-overflow -fstack-protector 
-fvisibility=hidden -g3 -O0  -I/scratch/balay/petsc/include 
-I/scratch/balay/petsc/arch-linux-c-debug/include -I/usr/local/cuda/include
`pwd`/ex55.c
PATH=`dirname 
/usr/local/cuda/bin/nvcc`:/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/bin:/usr/lib/ccache:/home/balay/.local/bin:/home/balay/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin::/nfs/gce/projects/gce/bin
 NVCC_WRAPPER_DEFAULT_COMPILER="mpicxx" 
/scratch/balay/petsc/arch-linux-c-debug/bin/nvcc_wrapper --expt-extended-lambda 
-c -ccbin mpicxx -std=c++17 -Xcompiler -fPIC -g -lineinfo -gencode 
arch=compute_86,code=sm_86 -I/scratch/balay/petsc/include 
-I/scratch/balay/petsc/arch-linux-c-debug/include -I/usr/local/cuda/include 
-Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas 
-Wno-lto-type-mismatch -Wno-psabi -fstack-protector -g 
-I/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/include -o ex55k.o 
`pwd`/ex55k.kokkos.cxx
mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch 
-Wno-stringop-overflow -fstack-protector -fvisibility=hidden -g3 -O0   
-Wl,-export-dynamic ex55.o ex55k.o  
-Wl,-rpath,/scratch/balay/petsc/arch-linux-c-debug/lib 
-L/scratch/balay/petsc/arch-linux-c-debug/lib 
-Wl,-rpath,/scratch/balay/petsc/arch-linux-c-debug/lib 
-L/scratch/balay/petsc/arch-linux-c-debug/lib -Wl,-rpath,/usr/local/cuda/lib64 
-L/usr/local/cuda/lib64 -L/usr/local/cuda/lib64/stubs 
-Wl,-rpath,/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/lib 
-L/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/11 -L/usr/lib/gcc/x86_64-linux-gnu/11 
-lpetsc -lkokkoskernels -lkokkoscontainers -lkokkoscore -lkokkossimd -llapack 
-lblas -lm -lcudart -lnvToolsExt -lcufft -lcublas -lcusparse -lcusolver 
-lcurand -lcuda -lX11 -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s 
-lquadmath -lstdc++ -lquadmath -o ex55
balay@petsc-gpu-01:/scratch/balay/petsc/src/snes/tutorials$ 
<<<

Satish


On Wed, 14 Feb 2024, Uralovich, Ibragimov Iskander wrote:

> Hello!
> 
> I want to start porting part of our PETSc-based code to GPU through kokkos 
> and I want to start with the exercises given in the tutorials folder.
> 
> For that I found two examples of using PETSC with kokkos in folder 
> petsc/src/snes/tutorials
> 
> I managed to successfully compile and run ex3k.kokkos.cxx, but I cannot 
> compile ex55k.kokkos.cxx.
> When I do make, I get the following error:
> 
> 
> iskander@apollo:~/software/petsc3_20/src/snes/tutorials$ make ex55k
> PATH=`dirname 
> nvcc`:/local/home/iskander/software/petsc3_20/petsc-3.20-deb/bin:/usr/local/cuda-12.2/bin:/local/home/iskander/.vscode-server/bin/05047486b6df5eb8d44b2ecd70ea3bdf775fd937/bin/remote-cli:/local/home/iskander/.local/bin:/local/home/iskander/software/petsc3_20/petsc-3.20-deb/bin:/usr/local/cuda-12.2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/paraview/v5.10.0/bin
>  
> NVCC_WRAPPER_DEFAULT_COMPILER="/local/home/iskander/software/petsc3_20/petsc-3.20-deb/bin"/mpicxx
>  /local/home/iskander/software/petsc3_20/petsc-3.20-deb/bin/nvcc_wrapper 
> --expt-extended-lambda -ccbin 
> /local/home/iskander/software/petsc3_20/petsc-3.20-deb/bin/mpicxx -std=c++17 
> -Xcompiler -fPIC -Xcompiler -fvisibility=hidden -g -lineinfo -gencode 
> arch=compute_80,code=sm_80  
> -I/local/home/iskander/software/petsc3_20/petsc-3.20-deb/include  
> -I/local/home/iskander/software/petsc3_20/include 
> -I/local/home/iskander/software/petsc3_20/arch-linux-c-d
 ebug/inc
 lude -I/local/home/iskander/software/petsc3_20/petsc-3.20-deb/include 
-I/usr/local/cuda-12.2/include-Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -Wno-lto-type-mismatch -Wno-psabi -fstack-protector 
-fvisibility=hidden -march=znver2
-L/local/home/iskander/software/petsc3_20/petsc-3.20-deb/lib -Wl,-rpath 
-Wl,/local/home/iskander/software/petsc3_20/petsc-3.20-deb/lib 
-Wl,--enable-new-dtags  -lmpi   ex55k.kokkos.cxx  

Re: [petsc-users] PetscSection: Fortran interfaces

2024-02-03 Thread Satish Balay via petsc-users
diff --git a/src/vec/f90-mod/petscvecmod.F90 b/src/vec/f90-mod/petscvecmod.F90
index 4c54fbf63dc..8772f89e135 100644
--- a/src/vec/f90-mod/petscvecmod.F90
+++ b/src/vec/f90-mod/petscvecmod.F90
@@ -163,6 +163,7 @@
 #include <../src/vec/f90-mod/petscvec.h90>
 interface
 #include <../src/vec/f90-mod/ftn-auto-interfaces/petscvec.h90>
+#include <../src/vec/f90-mod/ftn-auto-interfaces/petscpetscsection.h90>
 end interface
 end module


Perhaps there are more generated files that are missing from such listings.

Satish

On Sat, 3 Feb 2024, Martin Diehl wrote:

> thanks for the quick reply
> 
> I think my question was not clear enough:
> With "interfaces" I meant the Fortran module files that ensure correct
> calling signatures.
> 
> In the example (ex26f90), I can replace "ierr" of PetscSectionGetDof
> with anything without getting complaints from the compiler. This does
> not work for other functions, e.g. "ISRestoreIndicesF90" in line 272.
> 
> Martin
>  
> 
> On Sat, 2024-02-03 at 11:51 -0500, Matthew Knepley wrote:
> > Can you give us a simple code that is not working for you?  My test
> > should work
> > 
> >   src/dm/impls/plex/tests/ex26f90
> > 
> >   Thanks,
> > 
> >      Matt
> > 
> > On Sat, Feb 3, 2024 at 11:36 AM Barry Smith  wrote:
> > > 
> > >   The Fortran "stubs" (subroutines) should be in
> > > $PETSC_ARCH/src/vec/is/section/interface/ftn-auto/sectionf.c and
> > > compiled and linked into the PETSc library.
> > > 
> > >   The same tool that builds the interfaces in
> > > $PETSC_ARCH/src/vec/f90-mod/ftn-auto-
> > > interfaces/petscpetscsection.h90,  also builds the stubs so it is
> > > surprising one would exist but not the other.
> > > 
> > >   Barry
> > > 
> > > 
> > > > On Feb 3, 2024, at 11:27 AM, Martin Diehl
> > > >  wrote:
> > > > 
> > > > Dear PETSc team,
> > > > 
> > > > I currently can't make use of Fortran interfaces for "section".
> > > > In particular, I can't see how to use
> > > > 
> > > > PetscSectionGetFieldComponents
> > > > PetscSectionGetFieldDof
> > > > PetscSectionGetFieldOffset
> > > > 
> > > > The interfaces for them are created in
> > > > $PETSC_ARCH/src/vec/f90-mod/ftn-auto-interfaces/petscpetscsection.h90,
> > > > but it seems that they are not exposed to the public.
> > > > 
> > > > Could you give me a hint how to use them or fix this?
> > > > 
> > > > with best regards,
> > > > Martin
> > > > 
> > > > 
> > > > -- 
> > > > KU Leuven
> > > > Department of Computer Science
> > > > Department of Materials Engineering
> > > > Celestijnenlaan 200a
> > > > 3001 Leuven, Belgium
> > > > 
> > > 
> > 
> > 
> > -- 
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which
> > their experiments lead.
> > -- Norbert Wiener
> > 
> > https://www.cse.buffalo.edu/~knepley/
> 
> 


Re: [petsc-users] undefined reference to `petsc_allreduce_ct_th'

2024-01-18 Thread Satish Balay via petsc-users


On Thu, 18 Jan 2024, Aaron Scheinberg wrote:

> Hello,
> 
> I'm getting this error when linking:
> 
> undefined reference to `petsc_allreduce_ct_th'
> 
> The instances are regular MPI_Allreduces in my code that are not located in
> parts of the code related to PETSc, so I'm wondering what is happening to
> involve PETSc here? 

This symbol should be in libpetsc.so. Are you including petsc.h - but not 
linking in -lpetsc - from your code?

balay@pj01:~/petsc/arch-linux-c-debug/lib$ nm -Ao libpetsc.so |grep 
petsc_allreduce_ct_th
libpetsc.so:04279a50 B petsc_allreduce_ct_th

> Can I configure it to avoid that? I consulted google,
> the FAQ and skimmed other documentation but didn't see anything. Thanks!

If you wish to avoid petsc logging of MPI messages (but include petsc.h in your 
code?) - you can use in your code:


#define PETSC_HAVE_BROKEN_RECURSIVE_MACRO
#include <petsc.h>


Or build it with -DPETSC_HAVE_BROKEN_RECURSIVE_MACRO compiler option

Satish


Re: [petsc-users] M2 macs

2024-01-09 Thread Satish Balay via petsc-users
The usual xcode/clang + brew/gfortran should work.

https://gitlab.com/petsc/petsc/-/jobs/5895519334
https://gitlab.com/petsc/petsc/-/jobs/5895519414

There can be issues - not all CI builds work in M2 - with latest xcode [when I 
tried this previously] - so some CI jobs are still on Intel/Mac [with older 
xcode]

Satish

On Tue, 9 Jan 2024, Sanjay Govindjee via petsc-users wrote:

> I was wondering if anyone has build experience with PETSc + FORTRAN on an
> M2-based MAC?  In particular, I am looking for compiler recommendations.
> 
> -sanjay
> 
> 


Re: [petsc-users] Configure error while building PETSc with CUDA/MVAPICH2-GDR

2023-12-08 Thread Satish Balay via petsc-users
Executing: mpicc -show
stdout: icc -I/opt/apps/cuda/11.4/include -I/opt/apps/cuda/11.4/include -lcuda 
-L/opt/apps/cuda/11.4/lib64/stubs -L/opt/apps/cuda/11.4/lib64 -lcudart -lrt 
-Wl,-rpath,/opt/apps/cuda/11.4/lib64 -Wl,-rpath,XORIGIN/placeholder 
-Wl,--build-id -L/opt/apps/cuda/11.4/lib64/ -lm 
-I/opt/apps/intel19/mvapich2-gdr/2.3.7/include 
-L/opt/apps/intel19/mvapich2-gdr/2.3.7/lib64 -Wl,-rpath 
-Wl,/opt/apps/intel19/mvapich2-gdr/2.3.7/lib64 -Wl,--enable-new-dtags -lmpi

Checking for program /opt/apps/cuda/12.0/bin/nvcc...found

Looks like you are trying to mix in 2 different cuda versions in this build.

Perhaps you need to use cuda-11.4 - with this install of mvapich..

Satish

On Fri, 8 Dec 2023, Matthew Knepley wrote:

> On Fri, Dec 8, 2023 at 1:54 PM Sreeram R Venkat  wrote:
> 
> > I am trying to build PETSc with CUDA using the CUDA-Aware MVAPICH2-GDR.
> >
> > Here is my configure command:
> >
> > ./configure PETSC_ARCH=linux-c-debug-mvapich2-gdr --download-hypre
> >  --with-cuda=true --cuda-dir=$TACC_CUDA_DIR --with-hdf5=true
> > --with-hdf5-dir=$TACC_PHDF5_DIR --download-elemental --download-metis
> > --download-parmetis --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> >
> > which errors with:
> >
> >   UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> > details):
> >
> > -
> >   CUDA compile failed with arch flags " -ccbin mpic++ -std=c++14
> > -Xcompiler -fPIC
> >   -Xcompiler -fvisibility=hidden -g -lineinfo -gencode
> > arch=compute_80,code=sm_80"
> >   generated from "--with-cuda-arch=80"
> >
> >
> >
> > The same configure command works when I use the Intel MPI and I can build
> > with CUDA. The full config.log file is attached. Please let me know if you
> > need any other information. I appreciate your help with this.
> >
> 
> The proximate error is
> 
> Executing: nvcc -c -o /tmp/petsc-kn3f29gl/config.packages.cuda/conftest.o
> -I/tmp/petsc-kn3f29gl/config.setCompilers
> -I/tmp/petsc-kn3f29gl/config.types
> -I/tmp/petsc-kn3f29gl/config.packages.cuda  -ccbin mpic++ -std=c++14
> -Xcompiler -fPIC -Xcompiler -fvisibility=hidden -g -lineinfo -gencode
> arch=compute_80,code=sm_80  /tmp/petsc-kn3f29gl/config.packages.cuda/
> conftest.cu
> stdout:
> /opt/apps/cuda/11.4/include/crt/sm_80_rt.hpp(141): error: more than one
> instance of overloaded function "__nv_associate_access_property_impl" has
> "C" linkage
> 1 error detected in the compilation of
> "/tmp/petsc-kn3f29gl/config.packages.cuda/conftest.cu".
> Possible ERROR while running compiler: exit code 1
> stderr:
> /opt/apps/cuda/11.4/include/crt/sm_80_rt.hpp(141): error: more than one
> instance of overloaded function "__nv_associate_access_property_impl" has
> "C" linkage
> 
> 1 error detected in the compilation of
> "/tmp/petsc-kn3f29gl/config.packages.cuda
> 
> This looks like screwed up headers to me, but I will let someone that
> understands CUDA compilation reply.
> 
>   Thanks,
> 
>  Matt
> 
> Thanks,
> > Sreeram
> >
> 
> 
> 

Re: [petsc-users] petsc build could not pass make check

2023-11-29 Thread Satish Balay via petsc-users
Do you have a ~/.petscrc file - with -log_view enabled?

Satish

On Wed, 29 Nov 2023, Di Miao via petsc-users wrote:

> Hi,
> 
> I tried to compile PETSc with the following configuration:
> 
> ./configure --with-debugging=0 COPTFLAGS='-O3' CXXOPTFLAGS='-O3' 
> FOPTFLAGS='-O3' --with-clean=1 
> --with-make-exec=/SCRATCH/dimiao/test_space/installed/make/bin/make 
> --with-cmake-exec=/SCRATCH/dimiao/test_space/cmake-3.27.9-linux-x86_64/bin/cmake
>  --prefix=/SCRATCH/dimiao/test_space/installed/petsc_opt_mpi 
> --with-mpi-dir=/SCRATCH/dimiao/test_space/installed/mpich 
> PETSC_ARCH=petsc_opt_mpi 
> --with-blaslapack-dir=/SCRATCH/dimiao/oneapi/mkl/latest 
> --with-mkl_pardiso-dir=/SCRATCH/dimiao/oneapi/mkl/latest --with-x=0
> 
> I got three errors:
> Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process
> Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes
> Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI 
> process
> 
> Below each error message there is nothing but PETSc's performance summary.
> 
> I have attached make.log, configure.log and the message from make 
> check(make_check.log). Could you please give me some guidance on how to fix 
> this issue?
> 
> Thank you,
> Di
> 
> 
> 



Re: [petsc-users] Segmentation Violation error using SuperLU_DIST in ex 19.c

2023-11-24 Thread Satish Balay via petsc-users
Do you really need this combination of pkgs?

Matlab is distributed with ILP64 MKL - so it doesn't really work with
the LP64 blas/lapack that most external packages require - i.e. you can't
really use matlab together with other external packages.

[also it might not work with complex]

To get a successful matlab build - try:

./configure PETSC_ARCH=arch-linux-matlab --download-mpich 
--with-matlab-dir=/usr/local/matlab --with-matlab-engine=1 
--with-blaslapack-dir=/usr/local/matlab --known-64-bit-blas-indices=1

Satish

On Fri, 24 Nov 2023, maitri ksh wrote:

> Hi Satish,
> Yes, that simple build works with no error. I configured petsc again with
> the configure options:
> 
>  PETSC_ARCH=linux-gnu-c-debug -start-in-debugger[noxterm]
> --with-cc=/usr/local/gcc11/bin/gcc --with-cxx=/usr/local/gcc11/bin/g++
> --with-fc=gfortran --with-debugging=1 --with-scalar-type=complex
> --download-mpich --download-fblaslapack --with-matlab-dir=/usr/local/matlab
> --download-superlu --download-superlu_dist --download-hdf5 --download-mumps
> --download-scalapack --download--parmetis --download-metis
> --download-ptscotch --download-bison --download-cmake --download-make
> 
> Now, it runs the superLU_dist test successfully but it gives an error with
> MATLAB engine 'Possible error running C/C++ src/vec/vec/tutorials/ex31 with
> MATLAB engine' and also an error with MAKE check.
> 
> 
> 
> On Thu, Nov 23, 2023 at 10:26 PM Satish Balay  wrote:
> 
> > Can you do a simple build with only superlu-dist and see if the error
> > persists?
> >
> > ./configure PETSC_ARCH=linux-slu --with-cc=/usr/local/gcc11/bin/gcc
> > --with-cxx=/usr/local/gcc11/bin/g++ --with-fc=gfortran --with-debugging=1
> > --with-scalar-type=complex --download-mpich --download-fblaslapack
> > --download-superlu_dist
> > make
> > make PETSC_ARCH=linux-slu check
> >
> > Satish
> >
> > On Thu, 23 Nov 2023, maitri ksh wrote:
> >
> > > Hi,
> > > I ran into an error while using SuperLU_DIST in ex 19.c, I am not sure
> > how
> > > to debug this, can anyone please help. The 'configure.log' file is
> > attached
> > > for your reference.
> > > Thanks,
> > > Maitri
> > >
> >
> >
> 


Re: [petsc-users] Segmentation Violation error using SuperLU_DIST in ex 19.c

2023-11-23 Thread Satish Balay via petsc-users
Can you do a simple build with only superlu-dist and see if the error persists?

./configure PETSC_ARCH=linux-slu --with-cc=/usr/local/gcc11/bin/gcc 
--with-cxx=/usr/local/gcc11/bin/g++ --with-fc=gfortran --with-debugging=1 
--with-scalar-type=complex --download-mpich --download-fblaslapack 
--download-superlu_dist
make
make PETSC_ARCH=linux-slu check

Satish

On Thu, 23 Nov 2023, maitri ksh wrote:

> Hi,
> I ran into an error while using SuperLU_DIST in ex 19.c, I am not sure how
> to debug this, can anyone please help. The 'configure.log' file is attached
> for your reference.
> Thanks,
> Maitri
> 



Re: [petsc-users] Difficulty installing PETSC-3.17.0 on new macOS Sonoma

2023-11-20 Thread Satish Balay via petsc-users
replied on petsc-maint

xcode-15 changed considerably (fixes are in petsc-3.20) - enough that it's not 
easy to backport all the needed patches to 3.17.0

So the best bet for petsc-3.19 and older is to use Linux (remotely or via a VM) - 
or downgrade to Xcode-14.

Satish

On Mon, 20 Nov 2023, Jan Izak C. Vermaak via petsc-users wrote:

> Hi all,
> 
> I am in the process of upgrading our petsc version to 3.20.1 but I really 
> need our current version to work with 3.17.0.
> 
> I am having install issues. Attached is the config log. Command Line Tools 
> 15.0 (CLT 15.0) used to be the source of my problems with the previous OS 
> version for which the solution was to install the old CLT 14.3, however, CLT 
> 14.3 is not compatible with macOS Sonoma (which INL is requiring us to have).
> 
> Any help will be appreciated.
> 
> Regards,
> Jan
> 
> Jan Vermaak, Ph.D.
> Senior Nuclear Multiphysics Engineer   |   Reactor Physics Methods and 
> Analysis Department (C110)
> Reactor Systems Design and Analysis Division  |  Nuclear Science & Technology 
> Directorate
> janizak.verm...@inl.gov   |   M  979-739-0789
> Idaho National Laboratory   |   1955 Fremont Ave.   |   Idaho Falls, ID   |   
> 83415
> ___
> 
> 
> 



Re: [petsc-users] error in configuring PETSc

2023-11-08 Thread Satish Balay via petsc-users
Suggest attaching text logs (copy/paste) - instead of screenshots.

Try:

./configure --with-cc=gcc-11 --with-cxx=g++-11 --with-fc=gfortran-11 
--download-fftw --download-openmpi --download-fblaslapack --with-zlibs=yes 
--with-szlib=no --with-c2html=0 --with-x=0 --download-hdf5-fortran-bindings=1 
--download-hdf5 --download-sowing-cxx=g++-11

If you still have issues - send configure.log for this failure

Satish

On Thu, 9 Nov 2023, 张胜 wrote:

> Dear PETSc developer,
> 
> I use the following commands to configure petsc, but errors occur: 
> ./configure --with-cc=gcc-11 --with-cxx=g++-11 --with-fc=gfortran-11 
> --download-fftw --download-openmpi --download-fblaslapack 
> --free-line-length-0 -g -fallow-argument-mismatch --enable-shared 
> --enable-parallel --enable-fortran --with-zlibs=yes --with-szlib=no 
> --with-cxx-dialect=C++11 --with-c2html=0 --with-x=0 
> --download-hdf5-fortran-bindings=1 --download-hdf5
> 
> 
> I tried many times but cannot fix it. So I ask help for you. Thanks in 
> advance. 
> Best regards, 
> Sheng Zhang 
> 
> Ph.D 
> School of Materials Science and Engineering 
> Shanghai Jiao Tong University 
> 800 Dongchuan Road 
> Shanghai, 200240 China
> 


Re: [petsc-users] OpenMP doesn't work anymore with PETSc building rules

2023-10-25 Thread Satish Balay via petsc-users
I guess the flag you are looking for is CUDAFLAGS

>>>
balay@petsc-gpu-01:/scratch/balay/petsc/src/vec/vec/tests$ make ex100 
CUDAFLAGS="-Xcompiler -fopenmp" LDFLAGS=-fopenmp
/usr/local/cuda/bin/nvcc -o ex100.o -c 
-I/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/include  -ccbin mpicxx 
-std=c++17 -Xcompiler -fPIC -Xcompiler -fvisibility=hidden -g -lineinfo 
-gencode arch=compute_86,code=sm_86  -Xcompiler -fopenmp
-I/scratch/balay/petsc/include 
-I/scratch/balay/petsc/arch-linux-c-debug/include -I/usr/local/cuda/include
`pwd`/ex100.cu
mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch 
-Wno-stringop-overflow -fstack-protector -fvisibility=hidden -g3 -O0  -fopenmp 
-Wl,-export-dynamic ex100.o  
-Wl,-rpath,/scratch/balay/petsc/arch-linux-c-debug/lib 
-L/scratch/balay/petsc/arch-linux-c-debug/lib -Wl,-rpath,/usr/local/cuda/lib64 
-L/usr/local/cuda/lib64 -L/usr/local/cuda/lib64/stubs 
-Wl,-rpath,/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/lib 
-L/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/11 -L/usr/lib/gcc/x86_64-linux-gnu/11 
-lpetsc -llapack -lblas -lm -lcudart -lnvToolsExt -lcufft -lcublas -lcusparse 
-lcusolver -lcurand -lcuda -lX11 -lmpifort -lmpi -lgfortran -lm -lgfortran -lm 
-lgcc_s -lquadmath -lstdc++ -lquadmath -o ex100
rm ex100.o
balay@petsc-gpu-01:/scratch/balay/petsc/src/vec/vec/tests$ 
<<<
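
The same flags can also live in a PETSc-formatted makefile rather than on the 
make command line - a minimal sketch (assuming PETSC_DIR/PETSC_ARCH are set in 
the environment, and ex100.cu is the target from the log above):

```make
# Minimal PETSc-formatted makefile sketch: forward OpenMP to nvcc's host
# compiler via CUDAFLAGS (host flags need the -Xcompiler prefix) and to the
# link step via LDFLAGS.
CUDAFLAGS = -Xcompiler -fopenmp
LDFLAGS   = -fopenmp
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
```

With this in place, plain `make ex100` should pick up the flags without 
repeating them on the command line.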

Satish

On Wed, 25 Oct 2023, Qiyue Lu wrote:

> Even with
> CXXFLAGS=-Xcompiler -fopenmp -std=c++17
> LDFLAGS= -Xcompiler -fopenmp
> CXXPPFLAGS=-I/u/qiyuelu1/cuda/cuda-samples/Common
> include ${PETSC_DIR}/lib/petsc/conf/variables
> include ${PETSC_DIR}/lib/petsc/conf/rules
> 
> won't work.
> 
> On Wed, Oct 25, 2023 at 11:06 AM Qiyue Lu  wrote:
> 
> > Thanks for your reply, using this configurations:
> >
> > *--with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > --download-f2cblaslapack=1 --with-cudac=nvcc --with-cuda=1 --with-openmp=1
> > --with-threadsafety=1*
> > However, I got an error like:
> > *nvcc fatal   : Unknown option '-fopenmp'*
> > Previously, when I don't have --with-openmp for the configuration, the
> > PETSc make system can build my *.cu code using nvcc and g++, of course,
> > OpenMP doesn't work. Now with this --with-openmp option, it cannot even
> > build. The interesting thing is, I got this error even after removing the
> > *-fopenmp* from *CXXFLAGS* contents:
> > CXXFLAGS=-std=c++17
> > LDFLAGS=
> > CXXPPFLAGS=-I/u/qiyuelu1/cuda/cuda-samples/Common
> > include ${PETSC_DIR}/lib/petsc/conf/variables
> > include ${PETSC_DIR}/lib/petsc/conf/rules
> >
> >
> >
> > Thanks,
> > Qiyue Lu
> >
> > On Wed, Oct 25, 2023 at 10:54 AM Satish Balay  wrote:
> >
> >>
> >> On Wed, 25 Oct 2023, Qiyue Lu wrote:
> >>
> >> > Hello,
> >> > I have an in-house code enabled OpenMP and it works. Now I am trying to
> >> > incorporate PETSc as the linear solver and build together using the
> >> > building rules in $PETSC_HOME/lib/petsc/conf/rules. However, I found the
> >> > OpenMP part doesn't work anymore.
> >>
> >> If you are looking at building only your sources with openmp - using
> >> petsc formatted makefile [using petsc build rules],
> >> you can specify it via CFLAGS - either in makefile - or on command line.
> >>
> >> >>>
> >> For ex: [this example is using src/ksp/ksp/tutorials/makefile - with the
> >> corresponding make rules]
> >>
> >> [balay@pj01 tutorials]$ make ex2
> >> mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas
> >> -Wno-lto-type-mismatch -Wno-stringop-overflow -fstack-protector
> >> -fvisibility=hidden -g3 -O0  -I/home/balay/petsc/include
> >> -I/home/balay/petsc/arch-linux-c-debug/include -Wl,-export-dynamic
> >> ex2.c  -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib
> >> -L/home/balay/petsc/arch-linux-c-debug/lib
> >> -Wl,-rpath,/software/mpich-4.1.1/lib -L/software/mpich-4.1.1/lib
> >> -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/13
> >> -L/usr/lib/gcc/x86_64-redhat-linux/13 -lpetsc -llapack -lblas -lm -lX11
> >> -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++
> >> -lquadmath -o ex2
> >> [balay@pj01 tutorials]$ make clean
> >> [balay@pj01 tutorials]$ make ex2 CFLAGS=-fopenmp
> >> mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas
> >> -Wno-lto-type-mismatch -Wno-stringop-overflow -fstack-protector
> >> -fvisibility=hidden -g3 -O0 -fopenmp -I/home/balay/petsc/include
> >> -I/home/balay/petsc/arch-linux-c-debug/include -Wl,-export-dynamic
> >> ex2.c  -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib
> >> -L/home/balay/petsc/arch-linux-c-debug/lib
> >> -Wl,-rpath,/software/mpich-4.1.1/lib -L/software/mpich-4.1.1/lib
> >> -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/13
> >> -L/usr/lib/gcc/x86_64-redhat-linux/13 -lpetsc -llapack -lblas -lm -lX11
> >> -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++
> >> -lquadmath -o ex2
> >> [balay@pj01 tutorials]$
> >> <
> >>
> >> Satish
> >>
> >>
> >> > Should I re-configure the petsc 

Re: [petsc-users] OpenMP doesn't work anymore with PETSc building rules

2023-10-25 Thread Satish Balay via petsc-users


On Wed, 25 Oct 2023, Qiyue Lu wrote:

> Hello,
> I have an in-house code enabled OpenMP and it works. Now I am trying to
> incorporate PETSc as the linear solver and build together using the
> building rules in $PETSC_HOME/lib/petsc/conf/rules. However, I found the
> OpenMP part doesn't work anymore.

If you are looking at building only your sources with openmp - using petsc 
formatted makefile [using petsc build rules],
you can specify it via CFLAGS - either in makefile - or on command line.

>>>
For ex: [this example is using src/ksp/ksp/tutorials/makefile - with the 
corresponding make rules]

[balay@pj01 tutorials]$ make ex2
mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch 
-Wno-stringop-overflow -fstack-protector -fvisibility=hidden -g3 -O0  
-I/home/balay/petsc/include -I/home/balay/petsc/arch-linux-c-debug/include 
-Wl,-export-dynamic ex2.c  -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib 
-L/home/balay/petsc/arch-linux-c-debug/lib -Wl,-rpath,/software/mpich-4.1.1/lib 
-L/software/mpich-4.1.1/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/13 
-L/usr/lib/gcc/x86_64-redhat-linux/13 -lpetsc -llapack -lblas -lm -lX11 
-lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ 
-lquadmath -o ex2
[balay@pj01 tutorials]$ make clean
[balay@pj01 tutorials]$ make ex2 CFLAGS=-fopenmp
mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch 
-Wno-stringop-overflow -fstack-protector -fvisibility=hidden -g3 -O0 -fopenmp 
-I/home/balay/petsc/include -I/home/balay/petsc/arch-linux-c-debug/include 
-Wl,-export-dynamic ex2.c  -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib 
-L/home/balay/petsc/arch-linux-c-debug/lib -Wl,-rpath,/software/mpich-4.1.1/lib 
-L/software/mpich-4.1.1/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/13 
-L/usr/lib/gcc/x86_64-redhat-linux/13 -lpetsc -llapack -lblas -lm -lX11 
-lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ 
-lquadmath -o ex2
[balay@pj01 tutorials]$ 
<
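
The command-line usage above can equivalently be put in the PETSc-formatted 
makefile itself - a minimal sketch (assuming PETSC_DIR/PETSC_ARCH are set in 
the environment):

```make
# Minimal PETSc-formatted makefile sketch: enable OpenMP for your own C
# sources via CFLAGS and for the link step via LDFLAGS, instead of passing
# them on the make command line.
CFLAGS  = -fopenmp
LDFLAGS = -fopenmp
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
```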

Satish


> Should I re-configure the petsc installation with --with-openmp=1 option? I
> wonder are the building rules affected by this missing option?
> 
> Thanks,
> Qiyue Lu
> 



Re: [petsc-users] use MATSEQAIJMKL in 64-bit indices

2023-10-20 Thread Satish Balay via petsc-users
Try using the additional option --with-64-bit-blas-indices=1
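
For reference, a configure invocation combining 64-bit PETSc indices with 
ILP64 MKL might look like the sketch below (illustrative only - the MKLROOT 
path is an assumption about your install):

```shell
# Illustrative configure sketch: 64-bit PETSc indices together with ILP64
# (64-bit integer) MKL BLAS/LAPACK. $MKLROOT is assumed to point at your
# MKL installation root.
./configure --with-64-bit-indices=1 --with-64-bit-blas-indices=1 \
  --with-blaslapack-dir=$MKLROOT
```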

Satish

On Fri, 20 Oct 2023, Di Miao wrote:

> Hi,
> 
> I found that when compiled with '--with-64-bit-indices=1' option, the
> following three definitions in petscconf.h will be removed:
> 
> #define PETSC_HAVE_MKL_SPARSE 1
> #define PETSC_HAVE_MKL_SPARSE_OPTIMIZE 1
> #define PETSC_HAVE_MKL_SPARSE_SP2M_FEATURE 1
> 
> I believe mkl can also use 64-bit indices (libmkl_intel_ilp64). I tried to
> add ' --with-mkl_sparse=1 --with-mkl_sparse_optimize=1' into configuration
> but does not succeed.
> 
> Would I know if it is possible to use MATSEQAIJMKL matrix type in 64-bit
> mode?
> 
> Regards,
> Di
> 



Re: [petsc-users] Error when installing PETSc

2023-10-18 Thread Satish Balay via petsc-users
> Working directory: /home/tt/petsc-3.16.0

use latest petsc release - 3.20

> --with-fc=flang

I don't think this ever worked. Use --with-fc=gfortran instead

/opt/ohpc/pub/spack/opt/spack/linux-centos7-skylake_avx512/gcc-8.3.0/m4-1.4.19-lwqcw3hzoxoia5q6nzolylxaf5zevluk/bin/m4:
 internal error detected; please report this bug to : Illegal 
instruction

You might need to report this to your admin who installed this spack package.

They might need to rebuild spack for 'x86_64' instead of 'skylake_avx512'

Or use a different m4 - say from /usr/bin - if you have it there.

Satish

On Wed, 18 Oct 2023, Matthew Knepley wrote:

> On Wed, Oct 18, 2023 at 6:07 AM Gong Yujie 
> wrote:
> 
> > Dear PETSc developers,
> >
> > I got an error message when installing PETSc with a clang compiler. Could
> > you please help me find the problem? The configure.log is attached.
> >
> 
> Your compiler segfaulted when compiling OpenMPI:
> 
> Making all in mca/crs
> make[2]: Entering directory
> '/home/tt/petsc-3.16.0/optamd/externalpackages/openmpi-4.1.0/opal/mca/crs'
>   GENERATE opal_crs.7
>   CC   base/crs_base_open.lo
>   CC   base/crs_base_close.lo
>   CC   base/crs_base_select.lo
>   CC   base/crs_base_fns.lo
> make[2]: Leaving directory
> '/home/tt/petsc-3.16.0/optamd/externalpackages/openmpi-4.1.0/opal/mca/crs'
> make[1]: Leaving directory
> '/home/tt/petsc-3.16.0/optamd/externalpackages/openmpi-4.1.0/opal'
> /bin/sh: line 7:  6327 Illegal instruction (core dumped) ../../../config/
> make_manpage.pl --package-name='Open MPI' --package-version='4.1.0'
> --ompi-date='Dec 18, 2020' --opal-date='Dec 18, 2020' --orte-date='Dec 18,
> 2020' --input=opal_crs.7in --output=opal_crs.7
> make[2]: *** [Makefile:2215: opal_crs.7] Error 132
> make[2]: *** Waiting for unfinished jobs
> make[1]: *** [Makefile:2383: all-recursive] Error 1
> make: *** [Makefile:1901: all-recursive] Error 1
> 
> I suggest compiling MPICH instead.
> 
>   Thanks,
> 
>  Matt
> 
> 
> > Best Regards,
> > Yujie
> >
> > Here is the detail of the error:
> >
> > =
> >
> >   Configuring PETSc to compile on your system
> >
> > =
> > =
> >
> > * WARNING: Using default optimization C flags -g -O3
> > You might consider manually setting optimal optimization flags for your
> > system with
> > COPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for
> > examples
> > =
> >
> > =
> >
> > * WARNING: Using default Cxx optimization flags -g -O3
> >
> > You might consider manually setting optimal optimization flags for your
> > system with
> > CXXOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for
> > examples
> > =
> >
> > =
> >
> > * WARNING: Using default FORTRAN optimization flags -O
> >
> > You might consider manually setting optimal optimization flags for your
> > system with
> > FOPTFLAGS="optimization flags" see config/examples/arch-*-opt.py for
> > examples
> > =
> >
> > =
> >
> > Trying to download
> > https://download.open-mpi.org/release/open-mpi/v4.1/openmpi-4.1.0.tar.gz
> > for
> > =
> >
> > =
> >
> > Running configure on OPENMPI; this may take several minutes
> >
> > =
> >
> > =
> >
> > Running make on OPENMPI; this may take several minutes
> >
> > =
> >
> >
> > ***
> >  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for
> > details):
> >
> > ---
> > Error running make; make install on OPENMPI
> >
> > ***
> >
> >
> 
> 


Re: [petsc-users] Using Sundials from PETSc

2023-10-16 Thread Satish Balay via petsc-users
I'll note - current sundials release has some interfaces to petsc functionality

Satish

On Mon, 16 Oct 2023, Matthew Knepley wrote:

> On Mon, Oct 16, 2023 at 2:29 PM Vanella, Marcos (Fed) via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
> 
> > Hi, we were wondering if it would be possible to call the latest version
> > of Sundials from PETSc?
> >
> 
> The short answer is, no. We are at v2.5 and they are at v6.5. There were no
> dates on the version history page, so I do not know how out of date we are.
> There have not been any requests for update until now.
> 
> We would be happy to get an MR for the updates if you want to try it.
> 
> 
> > We are interested in doing chemistry using GPUs and already have
> > interfaces to PETSc from our code.
> >
> 
> How does the GPU interest interact with the SUNDIALS version?
> 
>   Thanks,
> 
>  Matt
> 
> 
> > Thanks,
> > Marcos
> >
> 
> 
> 


Re: [petsc-users] Configuration of PETSc with Intel OneAPI and Intel MPI fails

2023-10-11 Thread Satish Balay via petsc-users
The same docs should be available in 
https://web.cels.anl.gov/projects/petsc/download/release-snapshots/petsc-with-docs-3.20.0.tar.gz

Satish

On Wed, 11 Oct 2023, Richter, Roland wrote:

> Hei,
> Thank you very much for the answer! I looked it up, but petsc.org seems to
> be a bit unstable here, quite often I can't reach petsc.org. 
> Regards,
> Roland Richter
> 
> -Ursprüngliche Nachricht-
> Von: Satish Balay  
> Gesendet: mandag 9. oktober 2023 17:29
> An: Barry Smith 
> Cc: Richter, Roland ; petsc-users@mcs.anl.gov
> Betreff: Re: [petsc-users] Configuration of PETSc with Intel OneAPI and
> Intel MPI fails
> 
> Will note - OneAPI MPI usage is documented at
> https://petsc.org/release/install/install/#mpi
> 
> Satish
> 
> On Mon, 9 Oct 2023, Barry Smith wrote:
> 
> > 
> >   Instead of using the mpiicc -cc=icx style use -- with-cc=mpiicc (etc)
> and 
> > 
> > export I_MPI_CC=icx
> > export I_MPI_CXX=icpx
> > export I_MPI_F90=ifx
> > 
> > 
> > > On Oct 9, 2023, at 8:32 AM, Richter, Roland 
> wrote:
> > > 
> > > Hei,
> > > I'm currently trying to install PETSc on a server (Ubuntu 22.04) with
> Intel MPI and Intel OneAPI. To combine both, I have to use f. ex. "mpiicc
> -cc=icx" as C-compiler, as described by
> https://stackoverflow.com/a/76362396. Therefore, I adapted the
> configure-line as follow:
> > >  
> > > ./configure --prefix=/media/storage/local_opt/petsc
> --with-scalar-type=complex --with-cc="mpiicc -cc=icx" --with-cxx="mpiicpc
> -cxx=icpx" --CPPFLAGS="-fPIC -march=native -mavx2" --CXXFLAGS="-fPIC
> -march=native -mavx2" --with-fc="mpiifort -fc=ifx" --with-pic=true
> --with-mpi=true
> --with-blaslapack-dir=/opt/intel/oneapi/mkl/latest/lib/intel64/
> --with-openmp=true --download-hdf5=yes --download-netcdf=yes
> --download-chaco=no --download-metis=yes --download-slepc=yes
> --download-suitesparse=yes --download-eigen=yes --download-parmetis=yes
> --download-ptscotch=yes --download-mumps=yes --download-scalapack=yes
> --download-superlu=yes --download-superlu_dist=yes --with-mkl_pardiso=1
> --with-boost=1 --with-boost-dir=/media/storage/local_opt/boost
> --download-opencascade=yes --with-fftw=1
> --with-fftw-dir=/media/storage/local_opt/fftw3 --download-kokkos=yes
> --with-mkl_sparse=1 --with-mkl_cpardiso=1 --with-mkl_sparse_optimize=1
> --download-muparser=no --download-p4est=yes --download-sowing=yes
> --download-viennalcl=yes --with-zlib --force=1 --with-clean=1
> --with-cuda=1
> > >  
> > > The configuration, however, fails with 
> > >  
> > > The CMAKE_C_COMPILER:
> > >  
> > > mpiicc -cc=icx
> > >  
> > >   is not a full path and was not found in the PATH
> > >  
> > > for all additional modules which use a cmake-based configuration
> approach (such as OPENCASCADE). How could I solve that problem?
> > >  
> > > Thank you!
> > > Regards,
> > > Roland Richter
> > > 
> > 
> > 
> 


Re: [petsc-users] Configuration of PETSc with Intel OneAPI and Intel MPI fails

2023-10-09 Thread Satish Balay via petsc-users
Will note - OneAPI MPI usage is documented at 
https://petsc.org/release/install/install/#mpi

Satish

On Mon, 9 Oct 2023, Barry Smith wrote:

> 
>   Instead of using the mpiicc -cc=icx style use -- with-cc=mpiicc (etc) and 
> 
> export I_MPI_CC=icx
> export I_MPI_CXX=icpx
> export I_MPI_F90=ifx
> 
> 
> > On Oct 9, 2023, at 8:32 AM, Richter, Roland  wrote:
> > 
> > Hei,
> > I'm currently trying to install PETSc on a server (Ubuntu 22.04) with Intel 
> > MPI and Intel OneAPI. To combine both, I have to use f. ex. "mpiicc 
> > -cc=icx" as C-compiler, as described by 
> > https://stackoverflow.com/a/76362396. Therefore, I adapted the 
> > configure-line as follow:
> >  
> > ./configure --prefix=/media/storage/local_opt/petsc 
> > --with-scalar-type=complex --with-cc="mpiicc -cc=icx" --with-cxx="mpiicpc 
> > -cxx=icpx" --CPPFLAGS="-fPIC -march=native -mavx2" --CXXFLAGS="-fPIC 
> > -march=native -mavx2" --with-fc="mpiifort -fc=ifx" --with-pic=true 
> > --with-mpi=true 
> > --with-blaslapack-dir=/opt/intel/oneapi/mkl/latest/lib/intel64/ 
> > --with-openmp=true --download-hdf5=yes --download-netcdf=yes 
> > --download-chaco=no --download-metis=yes --download-slepc=yes 
> > --download-suitesparse=yes --download-eigen=yes --download-parmetis=yes 
> > --download-ptscotch=yes --download-mumps=yes --download-scalapack=yes 
> > --download-superlu=yes --download-superlu_dist=yes --with-mkl_pardiso=1 
> > --with-boost=1 --with-boost-dir=/media/storage/local_opt/boost 
> > --download-opencascade=yes --with-fftw=1 
> > --with-fftw-dir=/media/storage/local_opt/fftw3 --download-kokkos=yes 
> > --with-mkl_sparse=1 --with-mkl_cpardiso=1 --with-mkl_sparse_optimize=1 
> > --download-muparser=no --download-p4est=yes --download-sowing=yes 
> > --download-viennalcl=yes --with-zlib --force=1 --with-clean=1 --with-cuda=1
> >  
> > The configuration, however, fails with 
> >  
> > The CMAKE_C_COMPILER:
> >  
> > mpiicc -cc=icx
> >  
> >   is not a full path and was not found in the PATH
> >  
> > for all additional modules which use a cmake-based configuration approach 
> > (such as OPENCASCADE). How could I solve that problem?
> >  
> > Thank you!
> > Regards,
> > Roland Richter
> > 
> 
> 
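
Putting Barry's suggestion together, the environment-variable approach might 
look like this sketch (the configure options shown are just a representative 
subset of Roland's original command line, not the full set):

```shell
# Sketch: make the Intel MPI wrappers invoke the oneAPI LLVM compilers via
# environment variables, then point configure at the wrappers directly -
# cmake-based subpackages (like OPENCASCADE) then see a plain compiler path
# instead of "mpiicc -cc=icx".
export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_F90=ifx
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
  --with-scalar-type=complex --download-opencascade=yes
```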


Re: [petsc-users] 'nvcc -show' Error for configure with NVCC

2023-10-06 Thread Satish Balay via petsc-users



On Fri, 6 Oct 2023, Qiyue Lu wrote:

> Hello,
> I am trying to configure PETSc(current release version) with NVCC, with
> these options:
> ./configure --with-cc=nvcc --with-cxx=nvcc --with-fc=0 --with-cuda=1

this usage is incorrect. You need:

--with-cc=mpicc --with-cxx=mpicxx --with-cudac=nvcc --with-cuda=1

Satish

> 
> However, I got error like:
> -
>   Could not execute "['nvcc -show']":
>   nvcc fatal   : Unknown option '-show'
> *
> 
> I wonder where this -show option comes from? It seems safe to disable this
> option.
> 
> Thanks,
> Qiyue Lu
> 



Re: [petsc-users] Error when configure cmake

2023-09-29 Thread Satish Balay via petsc-users
Here --download-cmake is failing [due to the old version of c++ compiler]. You 
can try installing an older version manually instead of --download-cmake

Or you might be able to install a newer gcc/g++ easily [if its not already 
installed on your machine]. For ex:

git clone https://github.com/spack/spack/
cd spack
./bin/spack install gcc@7.5.0

Satish

On Sat, 30 Sep 2023, Pierre Jolivet wrote:

> You are using g++ (GCC) 4.4.7 20120313 (Red Hat 4.4.7-23).
> You need to use a less ancient C++ compiler.
> Please send logs to petsc-maint, not petsc-users.
> 
> Thanks,
> Pierre
> 
> > On 30 Sep 2023, at 7:00 AM, Ivan Luthfi  wrote:
> > 
> > I am trying to configure my petsc-3.13.6, but I have an error when running 
> > configure on cmake. Please help me on this. Here is the following 
> > configure.log.
> > 
> > 
> 


Re: [petsc-users] Issue when configuring my petsc-3.20.0

2023-09-29 Thread Satish Balay via petsc-users
Can you send us the complete configure.log file [as attachment] - perhaps to 
petsc-ma...@mcs.anl.gov

Satish

On Sat, 30 Sep 2023, Ivan Luthfi wrote:

> Hi team,
> I have an issue when I configure my petsc with the following script:
> 
> ./configure --with-fc=0 --download-f2cblaslapack=1 
> --with-blaslapack-lib=libsunperf.a --with-blas-lib=libblas.a 
> --with-lapack-lib=liblapack.a --download-make --with-cc=gcc 
> --with-cxx=/home/ivanluthfi1/openmpi/opt-4.1.1/bin/mpicxx --download-cmake 
> --download-superlu_dist --with-superlud-lib=libsuperlu_dist_3.3.a 
> --download-parmetis --with-parmetis-lib=libparmetis.a 
> --with-cc=/home/ivanluthfi1/openmpi/opt-4.1.1/bin/mpicc --download-metis 
> --with-metis-lib=libmetis.a
> 
>  Here is the configure.log message:
> 
>  PETSc Error: No output file produced
>   Rejecting compiler flag -std=c++11  due to nonzero status from 
> link
> *
>UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> details):
> -
>   Using C++ dialect C++11 as lower bound due to package(s):
>   - SuperLU_DIST
>   But C++ compiler (/home/ivanluthfi1/openmpi/opt-4.1.1/bin/mpicxx) appears 
> non-compliant
>   with C++11 or didn't accept:
>   - -std=gnu++20
>   - -std=c++20
>   - -std=gnu++17
>   - -std=c++17
>   - -std=gnu++14
>   - -std=c++14
>   - -std=gnu++11
>   - -std=c++11
> *
>   File "/home/ivanluthfi1/petsc/config/configure.py", line 462, in 
> petsc_configure
> framework.configure(out = sys.stdout)
>   File "/home/ivanluthfi1/petsc/config/BuildSystem/config/framework.py", line 
> 1447, in configure
> self.processChildren()
>   File "/home/ivanluthfi1/petsc/config/BuildSystem/config/framework.py", line 
> 1435, in processChildren
> self.serialEvaluation(self.childGraph)
>   File "/home/ivanluthfi1/petsc/config/BuildSystem/config/framework.py", line 
> 1410, in serialEvaluation
> child.configure()
>   File "/home/ivanluthfi1/petsc/config/BuildSystem/config/setCompilers.py", 
> line 2786, in configure
> 
> self.executeTest(self.checkCxxDialect,args=[LANG],kargs={'isGNUish':isGNUish})
>   File "/home/ivanluthfi1/petsc/config/BuildSystem/config/base.py", line 138, 
> in executeTest
> ret = test(*args,**kargs)
>   File "/home/ivanluthfi1/petsc/config/BuildSystem/config/setCompilers.py", 
> line 1225, in checkCxxDialect
> raise ConfigureSetupError(mess)
> 
> Finishing configure run at Sat, 30 Sep 2023 00:52:35 +0800
> 
> Please help me, I need to configure it with metis, parmetis, superlud, blas, 
> and lapack. 
> 
> 



Re: [petsc-users] Xcode 15.0 breaks PETSc configure?

2023-09-28 Thread Satish Balay via petsc-users
petsc git repo main branch has fixes for xcode-15. Can you give it a try?

Satish

On Thu, 28 Sep 2023, Paul Tackley wrote:

> Hello,
> 
> PETSc was working fine on my M1 Mac until I upgraded to Xcode 15.0 - now I 
> can’t even configure it. There seems to be a problem related to C and C++ in 
> Xcode 15.0. “Cxx libraries cannot directly be used with C as linker”. Stdout 
> message pasted below and configure.log file attached.
> 
> Thanks for any advice.
> Paul
> 
> 
> % gcc --version
> 
> Apple clang version 15.0.0 (clang-1500.0.40.1)
> 
> Target: arm64-apple-darwin22.6.0
> 
> 
> % g++ --version
> 
> Apple clang version 15.0.0 (clang-1500.0.40.1)
> 
> Target: arm64-apple-darwin22.6.0
> 
> 
> 
> *
> 
>UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for 
> details):
> 
> -
> 
>   Cxx libraries cannot directly be used with C as linker.
> 
>   If you don't need the C++ compiler to build external packages or for you 
> application you
> 
>   can run
> 
>   ./configure with --with-cxx=0. Otherwise you need a different combination 
> of C and C++
> 
>   compilers
> 
> *
> 
>   File "/Users/pjt/Software/PETSc/petsc-3.19.5/config/configure.py", line 
> 462, in petsc_configure
> 
> framework.configure(out = sys.stdout)
> 
>   File 
> "/Users/pjt/Software/PETSc/petsc-3.19.5/config/BuildSystem/config/framework.py",
>  line 1438, in configure
> 
> self.processChildren()
> 
>   File 
> "/Users/pjt/Software/PETSc/petsc-3.19.5/config/BuildSystem/config/framework.py",
>  line 1426, in processChildren
> 
> self.serialEvaluation(self.childGraph)
> 
>   File 
> "/Users/pjt/Software/PETSc/petsc-3.19.5/config/BuildSystem/config/framework.py",
>  line 1401, in serialEvaluation
> 
> child.configure()
> 
>   File 
> "/Users/pjt/Software/PETSc/petsc-3.19.5/config/BuildSystem/config/compilers.py",
>  line 1463, in configure
> 
> self.executeTest(self.checkCxxLibraries)
> 
>   File 
> "/Users/pjt/Software/PETSc/petsc-3.19.5/config/BuildSystem/config/base.py", 
> line 138, in executeTest
> 
> ret = test(*args,**kargs)
> 
>   ^^^
> 
>   File 
> "/Users/pjt/Software/PETSc/petsc-3.19.5/config/BuildSystem/config/compilers.py",
>  line 695, in checkCxxLibraries
> 
> raise RuntimeError("Cxx libraries cannot directly be used with C as 
> linker.\n\
> 
> 


Re: [petsc-users] I cannot find my libpetsc.a in my installation pets-3.4.5

2023-09-25 Thread Satish Balay via petsc-users
On Tue, 26 Sep 2023, Ivan Luthfi wrote:

> Sorry, in the petsc_lib I only found libpetsc.so, but not
> libpetsc.a,
> 
> are they functionally the same?

yes.

You would run 'make check' after the build - to verify if PETSc examples are 
able to compile, link correctly [to this libpetsc.so] - and run

And then - you should be able to use it similarly with your application code.

If there are issues with building your application code - compare the compile 
command used for the petsc examples  with the compile command used for your 
application code


Note: configure option --with-shared-libraries=1 [default] builds petsc shared 
library - i.e libpetsc.so. And if you use --with-shared-libraries=0 - it will 
build petsc as static library - i.e libpetsc.a

A compiler (linker) will look for a shared library and use it if found. If not 
found - it will look for a static version of the library - and use that instead.

Satish

> 
> thank you for your recommendation, I will try it step by step, because the
> code uses some deprecated functions that I have to change before it will
> compile with the latest version
> 
> On Tue, Sep 26, 2023 at 8:44 AM Satish Balay  wrote:
> 
> > We generally recommend getting a basic build going for such use case -
> > and then migrate it to latest version as old versions have more issues
> > [as you might be encountering now]
> >
> > Also - you haven't responded to my follow-up regarding the issue you
> > are encountering.
> >
> > If you are having build issues - you might want to recheck the
> > installation instructions (for petsc-3.4) - and send us relevant logs.
> >
> > Satish
> >
> > On Tue, 26 Sep 2023, Ivan Luthfi wrote:
> >
> > > the reason I install the old version is because i try to run an old code
> > > file that was compiled with the same version
> > >
> > > On Tue, Sep 26, 2023 at 4:20 AM Satish Balay  wrote:
> > >
> > > >
> > > > What are you looking at? Send 'ls' from PETSC_DIR/PETSC_ARCH/lib
> > > >
> > > > Perhaps its a shared library build - and you have libpetsc.so?
> > > >
> > > > BTW: the current release is 3.19 - and you are attempting to build a
> > super
> > > > old version 3.4.
> > > >
> > > > We recommend using the latest version to avoid build and other
> > > > compatibility issues.
> > > >
> > > > Satish
> > > >
> > > > On Tue, 26 Sep 2023, Ivan Luthfi wrote:
> > > >
> > > > > Dear developers,
> > > > > I cannot find libpetsc.a in my library.
> > > > > Do you have any suggestion how can I figure this out.
> > > > >
> > > >
> > > >
> > >
> >
> >
> 



Re: [petsc-users] I cannot find my libpetsc.a in my installation pets-3.4.5

2023-09-25 Thread Satish Balay via petsc-users
We generally recommend getting a basic build going for such use case -
and then migrate it to latest version as old versions have more issues
[as you might be encountering now]

Also - you haven't responded to my follow-up regarding the issue you
are encountering.

If you are having build issues - you might want to recheck the
installation instructions (for petsc-3.4) - and send us relevant logs.

Satish

On Tue, 26 Sep 2023, Ivan Luthfi wrote:

> the reason I install the old version is because i try to run an old code
> file that was compiled with the same version
> 
> On Tue, Sep 26, 2023 at 4:20 AM Satish Balay  wrote:
> 
> >
> > What are you looking at? Send 'ls' from PETSC_DIR/PETSC_ARCH/lib
> >
> > Perhaps its a shared library build - and you have libpetsc.so?
> >
> > BTW: the current release is 3.19 - and you are attempting to build a super
> > old version 3.4.
> >
> > We recommend using the latest version to avoid build and other
> > compatibility issues.
> >
> > Satish
> >
> > On Tue, 26 Sep 2023, Ivan Luthfi wrote:
> >
> > > Dear developers,
> > > I cannot find libpetsc.a in my library.
> > > Do you have any suggestion how can I figure this out.
> > >
> >
> >
> 



Re: [petsc-users] I cannot find my libpetsc.a in my installation pets-3.4.5

2023-09-25 Thread Satish Balay via petsc-users


What are you looking at? Send 'ls' from PETSC_DIR/PETSC_ARCH/lib

Perhaps its a shared library build - and you have libpetsc.so?

BTW: the current release is 3.19 - and you are attempting to build a super old 
version 3.4.

We recommend using the latest version to avoid build and other compatibility 
issues.

Satish

On Tue, 26 Sep 2023, Ivan Luthfi wrote:

> Dear developers,
> I cannot find libpetsc.a in my library.
> Do you have any suggestion how can I figure this out. 
> 



Re: [petsc-users] PETSc with Xcode 15

2023-09-21 Thread Satish Balay via petsc-users
Do you get this failure  with petsc main branch as well?

Satish

On Thu, 21 Sep 2023, Blaise Bourdin wrote:

> FWIW, CLT 15.0 also seems to include changes to the linker, with incompatible 
> options etc… I was able to rebuild mpich and petsc but I get many linker 
> warnings and have not fully tested my build
> 
> Before CLT 15.0 update
> 
> SiMini:mef90-dmplex (dmplex)$ ld -v
> 
> @(#)PROGRAM:ld  PROJECT:ld64-857.1
> 
> BUILD 23:13:29 May  7 2023
> 
> configured to support archs: armv6 armv7 armv7s arm64 arm64e arm64_32 i386 
> x86_64 x86_64h armv6m armv7k armv7m armv7em
> 
> LTO support using: LLVM version 14.0.3, (clang-1403.0.22.14.1) (static 
> support for 29, runtime is 29)
> 
> TAPI support using: Apple TAPI version 14.0.3 (tapi-1403.0.5.1)
> 
> 
> vs after:
> 
> bblaptop:mef90-dmplex (dmplex)$ ld -v
> 
> @(#)PROGRAM:ld  PROJECT:dyld-1015.7
> 
> BUILD 18:48:48 Aug 22 2023
> 
> configured to support archs: armv6 armv7 armv7s arm64 arm64e arm64_32 i386 
> x86_64 x86_64h armv6m armv7k armv7m armv7em
> 
> will use ld-classic for: armv6 armv7 armv7s arm64_32 i386 armv6m armv7k 
> armv7m armv7em
> 
> LTO support using: LLVM version 15.0.0 (static support for 29, runtime is 29)
> 
> TAPI support using: Apple TAPI version 15.0.0 (tapi-1500.0.12.3)
> 
> Library search paths:
> 
> Framework search paths:
> 
> 
> 
> bblaptop:tutorials (main)$ pwd
> 
> /opt/HPC/petsc-main/src/vec/vec/tutorials
> 
> bblaptop:tutorials (main)$ make ex1
> 
> mpicc -Wl,-bind_at_load -Wl,-multiply_defined,suppress -Wl,-multiply_defined 
> -Wl,suppress -Wl,-search_paths_first -Wl,-no_compact_unwind 
> -Wimplicit-function-declaration -Wunused -Wuninitialized -fPIC -g3 -O0 
> -I/opt/HPC/petsc-main/include 
> -I/opt/HPC/petsc-main/ventura-gcc13.2-arm64-g/include -I/opt/X11/include      
> ex1.c  -Wl,-rpath,/opt/HPC/petsc-main/ventura-gcc13.2-arm64-g/lib 
> -L/opt/HPC/petsc-main/ventura-gcc13.2-arm64-g/lib
> -Wl,-rpath,/opt/HPC/petsc-main/ventura-gcc13.2-arm64-g/lib 
> -L/opt/HPC/petsc-main/ventura-gcc13.2-arm64-g/lib -Wl,-rpath,/opt/X11/lib 
> -L/opt/X11/lib -Wl,-rpath,/opt/homebrew/Cellar/mpich/4.1.2/lib 
> -L/opt/homebrew/Cellar/mpich/4.1.2/lib
> -Wl,-rpath,/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current/gcc/aarch64-apple-darwin22/13
>  
> -L/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current/gcc/aarch64-apple-darwin22/13
>  -Wl,-rpath,/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current/gcc
> -L/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current/gcc 
> -Wl,-rpath,/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current 
> -L/opt/homebrew/Cellar/gcc/13.2.0/lib/gcc/current -lpetsc -lHYPRE -ldmumps 
> -lmumps_common -lpord -lpthread -lscalapack
> -lsuperlu -lml -llapack -lblas -lparmetis -lmetis -lexoIIv2for32 -lexodus 
> -lnetcdf -lpnetcdf -lhdf5_hl -lhdf5 -lchaco -ltriangle -lz -lctetgen -lX11 
> -lmpifort -lmpi -lpmpi -lgfortran -lemutls_w -lquadmath -lc++ -o ex1
> 
> ld: warning: -multiply_defined is obsolete
> 
> ld: warning: -multiply_defined is obsolete
> 
> ld: warning: duplicate -rpath 
> '/opt/HPC/petsc-main/ventura-gcc13.2-arm64-g/lib' ignored
> 
> ld: warning: -bind_at_load is deprecated on macOS
> 
> ld: warning: ignoring duplicate libraries: '-lmpi', '-lpmpi'
> 
> 
> 
> 
> That’s quite a curveball a week ahead of a major software update.
> 
> Blaise
> 
>   On Sep 20, 2023, at 10:43 PM, Ju Liu  wrote:
> 
> Caution: External email.
> 
> Hi PETSc team:
> I recently got my Xcode command line tools upgraded to version 15. When 
> installing PETSc, the configure command returns an error. My configure 
> command is 
> 
> ./configure --with-cc=gcc --with-fc=0 --with-cxx=0 --download-f2cblaslapack 
> --download-mpich
> 
> and the message is "Cannot locate all the standard C header files needed by 
> PETSc".
> 
> The configure.log file is attached.
> 
> How shall I fix this? 
> 
> Thanks,
> 
> Ju
> 
> 
> 
> — 
> Canada Research Chair in Mathematical and Computational Aspects of Solid 
> Mechanics (Tier 1)
> Professor, Department of Mathematics & Statistics
> Hamilton Hall room 409A, McMaster University
> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243
> 
> 
> 

Re: [petsc-users] Problem with BLASdot in 3.19.4

2023-09-19 Thread Satish Balay via petsc-users
It's a run-time option to the petsc (application) binary.

So you can either specify it via the command line - at run time - or add it to the env 
variable "PETSC_OPTIONS" - or add it to the $HOME/.petscrc file
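The three equivalent ways of passing such an option can be sketched as follows (the binary name ./app is a placeholder; -no_signal_handler is the option from this thread):

```shell
# 1. Command line, at run time:
./app -no_signal_handler

# 2. Environment variable read by PetscInitialize():
export PETSC_OPTIONS="-no_signal_handler"
./app

# 3. Persistent per-user default in the options file:
echo "-no_signal_handler" >> ~/.petscrc
./app
```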

Satish


On Tue, 19 Sep 2023, Thuc Bui wrote:

> Hi Barry,
> 
>  
> 
> Thanks for getting back to me. The diagnostics were generated when tracing 
> under the VS debugger. To use the option –no_signal_handler, I believe I will 
> need to reconfigure PETSc with this additional option. I will try it now.
> 
>  
> 
> Thuc
> 
>  
> 
>  
> 
> From: Barry Smith [mailto:bsm...@petsc.dev] 
> Sent: Tuesday, September 19, 2023 8:24 AM
> To: Thuc Bui
> Cc: PETSc users list
> Subject: Re: [petsc-users] Problem with BLASdot in 3.19.4
> 
>  
> 
>  
> 
>   Can you run in the Microsoft Visual Studio debugger? Use the additional 
> PETSc option -no_signal_handler
> 
>  
> 
>  
> 
>   It won't show exactly where the SEGV happens but might focus in a bit on 
> it. For example it may be ddot() or ddot_().
> 
>  
> 
>   Barry
> 
>  
> 
>  
> 
> 
> 
> 
> 
> On Sep 19, 2023, at 2:04 AM, Thuc Bui  wrote:
> 
>  
> 
> Hi Barry,
> 
>  
> 
> Visual Studio 2022 is the problem! The code linked to Petsc 3.18.6 built with 
> VS 2022 also crashes at the same place. The same errors are shown below. I 
> don’t remember for sure, but I don’t think I was able to configure Petsc 
> 3.19.4 with VS 2019. However, I will still try that next.
> 
>  
> 
> Thanks for your help,
> 
> Thuc
> 
>  
> 
> [0]PETSC ERROR: 
> 
> 
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> probably memory access out of range
> 
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> 
> [0]PETSC ERROR: or see   
> https://petsc.org/release/faq/#valgrind and   
> https://petsc.org/release/faq/
> 
> [0]PETSC ERROR: -  Stack Frames 
> 
> 
> [0]PETSC ERROR: The line numbers in the error traceback are not always exact.
> 
> [0]PETSC ERROR: #1 BLASdot()
> 
> [0]PETSC ERROR: #2 VecNorm_Seq() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\impls\seq\bvec2.c:216
> 
> [0]PETSC ERROR: #3 VecNorm() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\interface\rvector.c:237
> 
> [0]PETSC ERROR: #4 VecNormalize() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\interface\rvector.c:318
> 
> [0]PETSC ERROR: #5 KSPGMRESCycle() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\impls\gmres\gmres.c:111
> 
> [0]PETSC ERROR: #6 KSPSolve_GMRES() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\impls\gmres\gmres.c:228
> 
> [0]PETSC ERROR: #7 KSPSolve_Private() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\interface\itfunc.c:899
> 
> [0]PETSC ERROR: #8 KSPSolve() at 
> D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\interface\itfunc.c:1071
> 
>  
> 
> job aborted:
> 
> [ranks] message
> 
>  
> 
> [0] application aborted
> 
> aborting MPI_COMM_WORLD (comm=0x4400), error 59, comm rank 0
> 
>  
> 
>  
> 
>  
> 
> From: petsc-users [  
> mailto:petsc-users-boun...@mcs.anl.gov] On Behalf Of Thuc Bui
> Sent: Monday, September 18, 2023 4:24 PM
> To: 'Barry Smith'
> Cc: 'PETSc users list'
> Subject: Re: [petsc-users] Problem with BLASdot in 3.19.4
> 
>  
> 
> Thanks a lot Barry, for getting back to me. Will do what you have suggested, 
> and get back with the results.
> 
>  
> 
> Best regards,
> 
> Thuc
> 
>  
> 
> From: Barry Smith [  mailto:bsm...@petsc.dev] 
> Sent: Monday, September 18, 2023 3:43 PM
> To: Thuc Bui
> Cc: PETSc users list
> Subject: Re: [petsc-users] Problem with BLASdot in 3.19.4
> 
>  
> 
>  
> 
>   Ok, two things are being changed at the same time: the version  of PETSc 
> and the version of Visual Studio. 
> 
>  
> 
>   Could you please try with the new Visual Studio version but the same older 
> PETSc version? If that works could you try with the old Visual Studio version 
> but the new PETSc version? 
> 
>  
> 
>   Barry
> 
>  
> 
>  
> 
>  
> 
> On Sep 18, 2023, at 6:26 PM, Thuc Bui <  
> b...@calcreek.com> wrote:
> 
>  
> 
> Dear Petsc users and experts,
> 
>  
> 
> If someone can direct me how to track this bug, I would really appreciate it.
> 
>  
> 
> The Petsc DLL library version 3.19.4 was built on Windows 10 with Visual 
> Studio 2022, and with Microsoft MPI 10.1.2 and Intel MKL 2020.3.279. The same 
> code works fine with Petsc 3.18.6 using the same versions of MS MPI and Intel 
> MKL, and built with Visual Studio 2019.
> 
>  
> 
> When my code calls PetscCall(KSPSolve(...)), it gets to 
> PetscCall(KSPGMRESCycle(...)), PetscCall(VecNormalize(VEC_VV(0), )), 
> PetscCall(VecNorm(x, NORM_2, )) and then crashes in VecNorm_Seq() 

Re: [petsc-users] Problem with BLASdot in 3.19.4

2023-09-19 Thread Satish Balay via petsc-users
BTW: Can you check if you are using threaded MKL?

We default to:

  Libraries:  -L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/2022.1.0/lib/intel64 
mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib

If using threaded MKL - try setting the env variable "OMP_NUM_THREADS=1" and see if that 
makes a difference
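As a sketch (the binary name ./app.exe is a placeholder for the actual application):

```shell
# Force threaded MKL to run single-threaded for this run only;
# if the SEGV disappears, the threaded MKL layer is the likely culprit.
OMP_NUM_THREADS=1 ./app.exe

# or equivalently, via MKL's own control variable:
MKL_NUM_THREADS=1 ./app.exe
```

On Windows cmd.exe the same effect is obtained with `set OMP_NUM_THREADS=1` before running the binary.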

Satish

On Tue, 19 Sep 2023, Barry Smith wrote:

> 
>   Can you run in the Microsoft Visual Studio debugger? Use the additional 
> PETSc option -no_signal_handler
> 
> 
>   It won't show exactly where the SEGV happens but might focus in a bit on 
> it. For example it may be ddot() or ddot_().
> 
>   Barry
> 
> 
> 
> > On Sep 19, 2023, at 2:04 AM, Thuc Bui  wrote:
> > 
> > Hi Barry,
> >  
> > Visual Studio 2022 is the problem! The code linked to Petsc 3.18.6 built 
> > with VS 2022 also crashes at the same place. The same errors are shown 
> > below. I don’t remember for sure, but I don’t think I was able to configure 
> > Petsc 3.19.4 with VS 2019. However, I will still try that next.
> >  
> > Thanks for your help,
> > Thuc
> >  
> > [0]PETSC ERROR: 
> > 
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> > probably memory access out of range
> > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and 
> > https://petsc.org/release/faq/
> > [0]PETSC ERROR: -  Stack Frames 
> > 
> > [0]PETSC ERROR: The line numbers in the error traceback are not always 
> > exact.
> > [0]PETSC ERROR: #1 BLASdot()
> > [0]PETSC ERROR: #2 VecNorm_Seq() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\impls\seq\bvec2.c:216
> > [0]PETSC ERROR: #3 VecNorm() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\interface\rvector.c:237
> > [0]PETSC ERROR: #4 VecNormalize() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\interface\rvector.c:318
> > [0]PETSC ERROR: #5 KSPGMRESCycle() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\impls\gmres\gmres.c:111
> > [0]PETSC ERROR: #6 KSPSolve_GMRES() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\impls\gmres\gmres.c:228
> > [0]PETSC ERROR: #7 KSPSolve_Private() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\interface\itfunc.c:899
> > [0]PETSC ERROR: #8 KSPSolve() at 
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\interface\itfunc.c:1071
> >  
> > job aborted:
> > [ranks] message
> >  
> > [0] application aborted
> > aborting MPI_COMM_WORLD (comm=0x4400), error 59, comm rank 0
> >  
> >  
> >  
> > From: petsc-users [mailto:petsc-users-boun...@mcs.anl.gov] On Behalf Of 
> > Thuc Bui
> > Sent: Monday, September 18, 2023 4:24 PM
> > To: 'Barry Smith'
> > Cc: 'PETSc users list'
> > Subject: Re: [petsc-users] Problem with BLASdot in 3.19.4
> >  
> > Thanks a lot Barry, for getting back to me. Will do what you have 
> > suggested, and get back with the results.
> >  
> > Best regards,
> > Thuc
> >  
> > From: Barry Smith [mailto:bsm...@petsc.dev] 
> > Sent: Monday, September 18, 2023 3:43 PM
> > To: Thuc Bui
> > Cc: PETSc users list
> > Subject: Re: [petsc-users] Problem with BLASdot in 3.19.4
> >  
> >  
> >   Ok, two things are being changed at the same time: the version  of PETSc 
> > and the version of Visual Studio. 
> >  
> >   Could you please try with the new Visual Studio version but the same 
> > older PETSc version? If that works could you try with the old Visual Studio 
> > version but the new PETSc version? 
> >  
> >   Barry
> >  
> >  
> >  
> > 
> > On Sep 18, 2023, at 6:26 PM, Thuc Bui  > > wrote:
> >  
> > Dear Petsc users and experts,
> >  
> > If someone can direct me how to track this bug, I would really appreciate 
> > it.
> >  
> > The Petsc DLL library version 3.19.4 was built on Windows 10 with Visual 
> > Studio 2022, and with Microsoft MPI 10.1.2 and Intel MKL 2020.3.279. The 
> > same code works fine with Petsc 3.18.6 using the same versions of MS MPI 
> > and Intel MKL, and built with Visual Studio 2019.
> >  
> > When my code calls PetscCall(KSPSolve(...)), it gets to 
> > PetscCall(KSPGMRESCycle(...)), PetscCall(VecNormalize(VEC_VV(0), )), 
> > PetscCall(VecNorm(x, NORM_2, )) and then crashes in VecNorm_Seq() at:
> >  
> > PetscCallBLAS("BLASdot", ztmp[type == NORM_1_AND_2] = 
> > PetscSqrtReal(PetscRealPart(BLASdot_(&bn, xx, &one, xx, &one))));
> >  
> > I tried to step into BLASdot_, but was unable to. I assume BLASdot_ belong 
> > to Intel MKL, and its library does not contain debug information. After the 
> > code exits, the errors are shown below the hash line.
> >  
> > Should I install the latest Intel MKL to work with Petsc 3.19.4?
> >  
> > Many thanks in advance for your help,
> > Thuc Bui
> > Senior R Engineer
> > Calabazas Creek Research, Inc
> > 

Re: [petsc-users] Problem with BLASdot in 3.19.4

2023-09-19 Thread Satish Balay via petsc-users
On Tue, 19 Sep 2023, Matthew Knepley wrote:

> On Tue, Sep 19, 2023 at 7:04 AM Thuc Bui  wrote:
> 
> > Hi Barry,
> >
> >
> >
> > Visual Studio 2022 is the problem! The code linked to Petsc 3.18.6 built
> > with VS 2022 also crashes at the same place. The same errors are shown
> > below. I don’t remember for sure, but I don’t think I was able to configure
> > Petsc 3.19.4 with VS 2019. However, I will still try that next.
> >
> 
> It is so easy to hate VS. This suggests that VS is secretly linking to
> another BLAS, perhaps with different calling semantics.

There could be multiple issues:

- code bugs - that show up in different settings
- Intel Compilers/OneAPI have a tight dependency on the native compilers [on 
linux and windows]
So perhaps the current install is over VS19 - and not VS22. If this is the case 
- a reinstall [of Intel Compilers] over VS22 might help?
Or there could be different compiler dlls in the VS vs Intel installs - and one of 
them is misbehaving.. [so reorder the PATH locations for Intel/MS?]
- MKL bugs that show up in some setups? [here using an alternate blas - aka 
fblaslapack - might help?]
- Latest VS with latest OneAPI might help? [however our test setup is with 
older versions of these compilers]

>>>
Compilers:
  C Compiler: 
/cygdrive/e/svcpetsc/builds/Cw-cvdV3/0/petsc/petsc/lib/petsc/bin/win32fe/win_cl 
 -GF -MD -wd4996 -Zc:preprocessor  -O2 
Version: Win32 Development Tool Front End, version 1.11.4 Fri, Sep 10, 2021 
 6:33:40 PM\nMicrosoft (R) C/C++ Optimizing Compiler Version 19.32.31329 for x64
  C++ Compiler: 
/cygdrive/e/svcpetsc/builds/Cw-cvdV3/0/petsc/petsc/lib/petsc/bin/win32fe/win_cl 
 -GF -MD -GR -EHsc -Zc:preprocessor  -Zc:__cplusplus -O2 -Zm200  -std:c++20 -TP 
Version: Win32 Development Tool Front End, version 1.11.4 Fri, Sep 10, 2021 
 6:33:40 PM\nMicrosoft (R) C/C++ Optimizing Compiler Version 19.32.31329 for x64
  Fortran Compiler: 
/cygdrive/e/svcpetsc/builds/Cw-cvdV3/0/petsc/petsc/lib/petsc/bin/win32fe/win_ifort
  -MD -O3 -fpp 
Version: Win32 Development Tool Front End, version 1.11.4 Fri, Sep 10, 2021 
 6:33:40 PM\nIntel(R) Fortran Intel(R) 64 Compiler Classic for applications 
running on Intel(R) 64, Version 2021.6.0 Build 20220226_00
<<<
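To try the alternate-blas suggestion above, PETSc can be reconfigured to download and build the reference Fortran BLAS/LAPACK instead of linking MKL (a sketch; combine with the other configure options actually used for this Windows build):

```shell
# Replace the MKL blas/lapack with the reference implementation,
# built from source during configure; the remaining options stand
# in for whatever this build otherwise uses.
./configure --download-fblaslapack <other-configure-options>
```

If the SEGV disappears with fblaslapack, that points at the MKL/compiler combination rather than at PETSc itself.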

Satish

> 
>   Thanks,
> 
>  Matt
> 
> 
> > Thanks for your help,
> >
> > Thuc
> >
> >
> >
> > [0]PETSC ERROR:
> > 
> >
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> > probably memory access out of range
> >
> > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> >
> > [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and
> > https://petsc.org/release/faq/
> >
> > [0]PETSC ERROR: -  Stack Frames
> > 
> >
> > [0]PETSC ERROR: The line numbers in the error traceback are not always
> > exact.
> >
> > [0]PETSC ERROR: #1 BLASdot()
> >
> > [0]PETSC ERROR: #2 VecNorm_Seq() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\impls\seq\bvec2.c:216
> >
> > [0]PETSC ERROR: #3 VecNorm() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\interface\rvector.c:237
> >
> > [0]PETSC ERROR: #4 VecNormalize() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\vec\vec\interface\rvector.c:318
> >
> > [0]PETSC ERROR: #5 KSPGMRESCycle() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\impls\gmres\gmres.c:111
> >
> > [0]PETSC ERROR: #6 KSPSolve_GMRES() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\impls\gmres\gmres.c:228
> >
> > [0]PETSC ERROR: #7 KSPSolve_Private() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\interface\itfunc.c:899
> >
> > [0]PETSC ERROR: #8 KSPSolve() at
> > D:\Users\bbwannabe\Documents\Petsc\petsc-3.18.6\src\ksp\ksp\interface\itfunc.c:1071
> >
> >
> >
> > job aborted:
> >
> > [ranks] message
> >
> >
> >
> > [0] application aborted
> >
> > aborting MPI_COMM_WORLD (comm=0x4400), error 59, comm rank 0
> >
> >
> >
> >
> >
> >
> >
> > *From:* petsc-users [mailto:petsc-users-boun...@mcs.anl.gov] *On Behalf
> > Of *Thuc Bui
> > *Sent:* Monday, September 18, 2023 4:24 PM
> > *To:* 'Barry Smith'
> > *Cc:* 'PETSc users list'
> > *Subject:* Re: [petsc-users] Problem with BLASdot in 3.19.4
> >
> >
> >
> > Thanks a lot Barry, for getting back to me. Will do what you have
> > suggested, and get back with the results.
> >
> >
> >
> > Best regards,
> >
> > Thuc
> >
> >
> >
> > *From:* Barry Smith [mailto:bsm...@petsc.dev]
> > *Sent:* Monday, September 18, 2023 3:43 PM
> > *To:* Thuc Bui
> > *Cc:* PETSc users list
> > *Subject:* Re: [petsc-users] Problem with BLASdot in 3.19.4
> >
> >
> >
> >
> >
> >   Ok, two things are being changed at the same time: the version  of PETSc
> > and the version of Visual Studio.
> >
> >
> >
> >   Could you please try with the new Visual Studio version but the same
> > older PETSc 

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-06 Thread Satish Balay via petsc-users
BTW: Stepping back and looking that the error message:

> > > >> Error: The import statement 'import matlab.internal.engine.input'
> > cannot be found or cannot be imported. Imported names must end with '.*' or
> > be fully qualified.

Google suggests:

https://www.mathworks.com/matlabcentral/answers/494387-python-engineerror-import-argument-matlab-internal-engine-input-cannot-be-found-or-cannot-be-impo

i.e. a wrong "input.m" file is loaded - that's causing grief?

But its not clear to me how one would track this file. MATLABPATH? some default 
matlab config files?

And is this conflicting file coming from matlab install or $HOME?

Or is the conflict due to a different ".m" file - than input.m?
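One way to track down which input.m MATLAB actually picks up is MATLAB's own `which` command, run non-interactively (a sketch; assumes the matlab binary is on PATH):

```shell
# Print every file shadowing the built-in 'input' on the MATLAB path;
# the first entry listed is the one that wins.
matlab -nodisplay -nosplash -r "which -all input; exit"

# MATLABPATH can also be inspected (and cleared) before starting matlab:
echo "$MATLABPATH"
```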

Assuming this is indeed the issue - I would suggest trying this out in
a fresh user account - and see if things work [both a basic matlab-only
test - and petsc+matlab]

Satish

On Thu, 7 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:

> I tried with different examples and all are working later.
> As you said there may be a license issue.
> I will try with latest matlab version.
> 
> Thanks and regards
> Srinivas
> 
> On Thu, Sep 7, 2023, 00:24 Satish Balay  wrote:
> 
> > I'm still curious why the first [c++] attempt failed but the second one
> > succeeded.
> >
> > The only difference I can spot is myEngineApp.cpp vs myEngineApp2.cpp.
> > What is the difference here between these 2 source files?
> >
> > Or is it some random error? [the same source code failed before -
> > but works now - and might fail again later?]
> >
> > Maybe there is some issue with the matlab license here - that causes some
> > usages (or usage patterns) to fail?
> >
> > Satish
> >
> > On Wed, 6 Sep 2023, Barry Smith wrote:
> >
> > >
> > >Ok, so the C++ binding for Matlab engine is working on your machine.
> > PETSc uses the C binding which links against different libraries but, one
> > would think the C binding would work for sure if the C++ binding works. Can
> > you try a Matlab Engine C example provided by Matlab?
> > >
> > >
> > >
> > >
> > > > On Sep 6, 2023, at 3:58 AM, INTURU SRINIVAS 20PHD0548 <
> > inturu.srinivas2...@vitstudent.ac.in> wrote:
> > > >
> > > > Hi Amneet,
> > > >
> > > > I repeated the same procedure again and this time it is working.
> > > > $g++ -std=c++11 -I /usr/local/MATLAB/R2020b/extern/include/ -L
> > /usr/local/MATLAB/R2020b/extern/bin/glnxa64/ -pthread myEngineApp2.cpp
> > -lMatlabDataArray -lMatlabEngine
> > > >
> > $LD_LIBRARY_PATH=/usr/local/MATLAB/R2020b/extern/bin/glnxa64:/usr/local/MATLAB/R2020b/sys/os/glnxa64
> > > > $export LD_LIBRARY_PATH
> > > > $./a.out
> > > > Square root of -2 is 0 + 1.41421i
> > > > Square root of 2 is 1.41421 + 0i
> > > > Square root of 6 is 2.44949 + 0i
> > > > Square root of 8 is 2.82843 + 0i
> > > > I don't know how it is working this time?
> > > >
> > > > Thanks and regards
> > > > Srinivas
> > > >
> > > >
> > > > On Wed, Sep 6, 2023 at 12:17 PM INTURU SRINIVAS 20PHD0548 <
> > inturu.srinivas2...@vitstudent.ac.in  > inturu.srinivas2...@vitstudent.ac.in>> wrote:
> > > >> Hi Amneet,
> > > >> I tried the following example to run a matlab engine.
> > > >>
> > > >> #include "MatlabDataArray.hpp"
> > > >> #include "MatlabEngine.hpp"
> > > >> #include <iostream>
> > > >> void callSQRT() {
> > > >>
> > > >> using namespace matlab::engine;
> > > >>
> > > >> // Start MATLAB engine synchronously
> > > >> std::unique_ptr matlabPtr = startMATLAB();
> > > >>
> > > >> //Create MATLAB data array factory
> > > >> matlab::data::ArrayFactory factory;
> > > >>
> > > >> // Define a four-element typed array
> > > >> matlab::data::TypedArray<double> const argArray =
> > > >> factory.createArray({ 1,4 }, { -2.0, 2.0, 6.0, 8.0 });
> > > >>
> > > >> // Call MATLAB sqrt function on the data array
> > > >> matlab::data::Array const results = matlabPtr->feval(u"sqrt",
> > argArray);
> > > >>
> > > >> // Display results
> > > >> for (int i = 0; i < results.getNumberOfElements(); i++) {
> > > >> double a = argArray[i];
> > > >> std::complex<double> v = results[i];
> > > >> double realPart = v.real();
> > > >> double imgPart = v.imag();
> > > >> std::cout << "Square root of " << a << " is " <<
> > > >> realPart << " + " << imgPart << "i" << std::endl;
> > > >> }
> > > >> }
> > > >>
> > > >> int main() {
> > > >> callSQRT();
> > > >> return 0;
> > > >> }
> > > >>
> > > >> $g++ -std=c++11 -I /usr/local/MATLAB/R2020b/extern/include/ -L
> > /usr/local/MATLAB/R2020b/extern/bin/glnxa64/ -pthread myEngineApp.cpp
> > -lMatlabDataArray -lMatlabEngine
> > > >>
> > $LD_LIBRARY_PATH=/usr/local/MATLAB/R2020b/extern/bin/glnxa64:/usr/local/MATLAB/R2020b/sys/os/glnxa64
> > > >> $export LD_LIBRARY_PATH
> > > >> $./a.out
> > > >> Starting
> > > >> Error: The import statement 'import matlab.internal.engine.input'
> > cannot be found or cannot be imported. Imported names must end with '.*' or
> > be fully 

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-03 Thread Satish Balay via petsc-users
>>> shooting.
> > > >>> # [0]PETSC ERROR: Petsc Release Version 3.13.4, Aug 01, 2020
> > > >>> # [0]PETSC ERROR: ../matlab_ls_test on a linux-opt named
> > MB108SMEC028 by
> > > >>> vit Sun Sep  3 11:51:05 2023
> > > >>> # [0]PETSC ERROR: Configure options
> > > >>> --with-mpi-dir=/home/vit/sfw/linux/openmpi/4.1.4 COPTFLAGS=-O3
> > > >>> CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 PETSC_ARCH=linux-opt --with-debugging=0
> > > >>> --with-x=0 --with-matlab-dir=/usr/local/MATLAB/R2020b
> > > >>> --with-blaslapack-dir=/usr/local/MATLAB/R2020b
> > > >>> --known-64-bit-blas-indices=1 --with-matlab-engine=1
> > > >>> # [0]PETSC ERROR: #1 PetscMatlabEngineCreate() line 67 in
> > > >>> /home/vit/sfw/petsc/3.13.4/src/sys/classes/matlabengine/matlab.c
> > > >>> # [0]PETSC ERROR: #2 main() line 126 in
> > > >>>
> > /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c
> > > >>> # [0]PETSC ERROR: PETSc Option Table entries:
> > > >>> # [0]PETSC ERROR: -prob_id 5
> > > >>> # [0]PETSC ERROR: -tao_smonitor
> > > >>> # [0]PETSC ERROR: End of Error Message ---send
> > > >>> entire error message to petsc-ma...@mcs.anl.gov--
> > > >>> #
> > > >>>
> > --
> > > >>> # MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
> > > >>> # with errorcode 126076.
> > > >>> #
> > > >>> # NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > > >>> # You may or may not see output from other processes, depending on
> > > >>> # exactly when Open MPI kills them.
> > > >>> #
> > > >>>
> > --
> > > >>>  ok tao_leastsquares_tutorials_matlab-matlab_ls_test # SKIP Command
> > > >>> failed so no diff
> > > >>>
> > > >>> # -
> > > >>> #   Summary
> > > >>> # -
> > > >>> # FAILED ksp_ksp_tutorials-ex72_12
> > > >>> tao_leastsquares_tutorials_matlab-matlab_ls_test
> > > >>> # success 0/2 tests (0.0%)
> > > >>> # failed 2/2 tests (100.0%)
> > > >>> # todo 0/2 tests (0.0%)
> > > >>> # skip 0/2 tests (0.0%)
> > > >>> #
> > > >>> # Wall clock time for tests: 3 sec
> > > >>> # Approximate CPU time (not incl. build time): 0.05 sec
> > > >>> #
> > > >>> # To rerun failed tests:
> > > >>> # /usr/bin/make -f gmakefile test test-fail=1
> > > >>> #
> > > >>> # Timing summary (actual test time / total CPU time):
> > > >>> #   ksp_ksp_tutorials-ex72_12: 0.02 sec / 0.03 sec
> > > >>> #   tao_leastsquares_tutorials_matlab-matlab_ls_test: 0.02 sec /
> > 0.02 sec
> > > >>>
> > > >>> *How to sort out this error?*
> > > >>>
> > > >>> *$cd src/tao/leastsquares/tutorials/matlab/*
> > > >>> *$make matlab_ls_test*
> > > >>> /home/vit/sfw/linux/openmpi/4.1.4/bin/mpicc -fPIC -Wall
> > -Wwrite-strings
> > > >>> -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector
> > > >>> -fvisibility=hidden -O3 -fPIC -Wall -Wwrite-strings
> > -Wno-strict-aliasing
> > > >>> -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O3
> > > >>>  -I/home/vit/sfw/petsc/3.13.4/include
> > > >>> -I/home/vit/sfw/petsc/3.13.4/linux-opt/include
> > > >>> -I/usr/local/MATLAB/R2020b/extern/include
> > > >>> -I/home/vit/sfw/linux/openmpi/4.1.4/include  matlab_ls_test.c
> > > >>>  -Wl,-rpath,/home/vit/sfw/petsc/3.13.4/linux-opt/lib
> > > >>> -L/home/vit/sfw/petsc/3.13.4/linux-opt/lib
> > > >>> /usr/local/MATLAB/R2020b/bin/glnxa64/mkl.so
> > > >>> -Wl,-rpath,/usr/local/MATLAB/R2020b/sys/os/glnxa64
> > > >>> -L/usr/local/MATLAB/R2020b/sys/os/glnxa64
> > > >>>
> > -Wl,-rpath,/usr/local/MATLAB/R2020b/sys/os/glnxa64:/usr/local/MATLAB/R2020b/bin/glnxa64:/usr/local/MATLAB/R202

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-03 Thread Satish Balay via petsc-users
---
> >>> # MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
> >>> # with errorcode 126076.
> >>> #
> >>> # NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> >>> # You may or may not see output from other processes, depending on
> >>> # exactly when Open MPI kills them.
> >>> #
> >>> --
> >>>  ok tao_leastsquares_tutorials_matlab-matlab_ls_test # SKIP Command
> >>> failed so no diff
> >>>
> >>> # -
> >>> #   Summary
> >>> # -
> >>> # FAILED ksp_ksp_tutorials-ex72_12
> >>> tao_leastsquares_tutorials_matlab-matlab_ls_test
> >>> # success 0/2 tests (0.0%)
> >>> # failed 2/2 tests (100.0%)
> >>> # todo 0/2 tests (0.0%)
> >>> # skip 0/2 tests (0.0%)
> >>> #
> >>> # Wall clock time for tests: 3 sec
> >>> # Approximate CPU time (not incl. build time): 0.05 sec
> >>> #
> >>> # To rerun failed tests:
> >>> # /usr/bin/make -f gmakefile test test-fail=1
> >>> #
> >>> # Timing summary (actual test time / total CPU time):
> >>> #   ksp_ksp_tutorials-ex72_12: 0.02 sec / 0.03 sec
> >>> #   tao_leastsquares_tutorials_matlab-matlab_ls_test: 0.02 sec / 0.02 sec
> >>>
> >>> *How to sort out this error?*
> >>>
> >>> *$cd src/tao/leastsquares/tutorials/matlab/*
> >>> *$make matlab_ls_test*
> >>> /home/vit/sfw/linux/openmpi/4.1.4/bin/mpicc -fPIC -Wall -Wwrite-strings
> >>> -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector
> >>> -fvisibility=hidden -O3 -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing
> >>> -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O3
> >>>  -I/home/vit/sfw/petsc/3.13.4/include
> >>> -I/home/vit/sfw/petsc/3.13.4/linux-opt/include
> >>> -I/usr/local/MATLAB/R2020b/extern/include
> >>> -I/home/vit/sfw/linux/openmpi/4.1.4/include  matlab_ls_test.c
> >>>  -Wl,-rpath,/home/vit/sfw/petsc/3.13.4/linux-opt/lib
> >>> -L/home/vit/sfw/petsc/3.13.4/linux-opt/lib
> >>> /usr/local/MATLAB/R2020b/bin/glnxa64/mkl.so
> >>> -Wl,-rpath,/usr/local/MATLAB/R2020b/sys/os/glnxa64
> >>> -L/usr/local/MATLAB/R2020b/sys/os/glnxa64
> >>> -Wl,-rpath,/usr/local/MATLAB/R2020b/sys/os/glnxa64:/usr/local/MATLAB/R2020b/bin/glnxa64:/usr/local/MATLAB/R2020b/extern/lib/glnxa64
> >>> -L/usr/local/MATLAB/R2020b/bin/glnxa64
> >>> -L/usr/local/MATLAB/R2020b/extern/lib/glnxa64
> >>> -Wl,-rpath,/home/vit/sfw/linux/openmpi/4.1.4/lib
> >>> -L/home/vit/sfw/linux/openmpi/4.1.4/lib
> >>> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9
> >>> -L/usr/lib/gcc/x86_64-linux-gnu/9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu
> >>> -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu
> >>> -L/lib/x86_64-linux-gnu -lpetsc -liomp5 -lpthread -lm -leng -lmex -lmx
> >>> -lmat -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh
> >>> -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread 
> >>> -lquadmath
> >>> -lstdc++ -ldl -o matlab_ls_test
> >>>
> >>> I think there is a problem with the Matlab-R2020b version.
> >>> I am sharing the configure.log and make.log files. Please find the
> >>> attachment and do the needful.
> >>>
> >>>
> >>> Thanks and regards
> >>> Srinivas
> >>>
> >>>
> >>> On Sat, Sep 2, 2023 at 11:15 PM Satish Balay  wrote:
> >>>
> >>>> Perhaps you can try the following to get additional info - and debug
> >>>>
> >>>> Satish
> >>>> --
> >>>>
> >>>> balay@compute-386-07:/scratch/balay/petsc$ cd
> >>>> src/tao/leastsquares/tutorials/matlab/
> >>>> balay@compute-386-07:/scratch/balay/petsc/src/tao/leastsquares/tutorials/matlab$
> >>>> make matlab_ls_test
> >>>> /nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/bin/mpicc -fPIC -Wall
> >>>> -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas 
> >>>> -fstack-protector
> >>>> -fvisibility=hidden -O3 -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing
> >>>> -Wno-unknown-pragmas -fstack-protector -fvisibility=hidde

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-03 Thread Satish Balay via petsc-users
 > -I/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/include
> > matlab_ls_test.c  -Wl,-rpath,/scratch/balay/petsc/linux-opt/lib
> > -L/scratch/balay/petsc/linux-opt/lib
> > -Wl,-rpath,/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/sys/os/glnxa64:/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/glnxa64:/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/extern/lib/glnxa64
> > -L/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/glnxa64
> > -L/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/extern/lib/glnxa64
> > -Wl,-rpath,/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/lib
> > -L/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/lib
> > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9
> > -L/usr/lib/gcc/x86_64-linux-gnu/9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu
> > -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu
> > -L/lib/x86_64-linux-gnu -lpetsc -llapack -lblas -lpthread -lm -leng -lmex
> > -lmx -lmat -lut -licudata -licui18n -licuuc -lstdc++ -ldl -lmpifort -lmpi
> > -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -o
> > matlab_ls_test
> > balay@compute-386-07:/scratch/balay/petsc/src/tao/leastsquares/tutorials/matlab$
> > LD_PRELOAD=/lib/x86_64-linux-gnu/libgfortran.so.5 ./matlab_ls_test
> > -tao_smonitor -prob_id 5 -info
> > [0] PetscInitialize(): PETSc successfully started: number of processors = 1
> > [0] PetscGetHostName(): Rejecting domainname, likely is NIS
> > compute-386-07.(none)
> > [0] PetscInitialize(): Running on machine: compute-386-07
> > Running problem 5
> > [0] PetscCommDuplicate(): Duplicating a communicator 1140850689
> > -2080374784 max tags = 268435455
> > [0] PetscMatlabEngineCreate(): Starting MATLAB engine with command
> > /nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab
> > -glnxa64 -nodisplay  -nosplash
> > [0] PetscMatlabEngineCreate(): Started MATLAB engine
> > [0] PetscMatlabEngineEvaluate(): Evaluating MATLAB string:
> > TestingInitialize
> > [0] PetscMatlabEngineEvaluate(): Done evaluating Matlab string:
> > TestingInitialize
> > 5
> > 
> >
> > <<<< Now verify if the above matlab command actually works for you - on
> > this machine..>>>>>
> >
> > balay@compute-386-07:/scratch/balay/petsc/src/tao/leastsquares/tutorials/matlab$
> > /nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab
> > -glnxa64 -nodisplay  -nosplash
> >
> >
> >  < M A T L A B (R) >
> >
> >Copyright 1984-2021 The MathWorks, Inc.
> >
> >R2021a (9.10.0.1602886) 64-bit (glnxa64)
> >
> >   February 17, 2021
> >
> >
> > To get started, type doc.
> > For product information, visit www.mathworks.com.
> >
> > >>
> >
> >
> > On Sat, 2 Sep 2023, Satish Balay via petsc-users wrote:
> >
> > > Please don't cc: both petsc-users and petsc-maint [reverting thread to
> > petsc-users only]
> > >
> > > I'm not sure what is happening here. Can you send the corresponding
> > configure.log, make.log [compressed]?
> > >
> > > Here is my attempt to reproduce (with petsc-3.13) on Ubuntu-20.04, with
> > Matlab-R2021a - and that works:
> > >
> > > balay@compute-386-07:/scratch/balay/petsc$ git branch
> > >   main
> > >   release
> > > * release-3.13
> > > balay@compute-386-07:/scratch/balay/petsc$ module load matlab/R2021a
> > > balay@compute-386-07:/scratch/balay/petsc$ which matlab
> > >
> > /nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab
> > > balay@compute-386-07:/scratch/balay/petsc$ ./configure
> > --with-mpi-dir=/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/
> > COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 PETSC_ARCH=linux-opt
> > --with-debugging=0 --with-x=0
> > --with-matlab-dir=/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a
> > --with-blaslapack-dir=/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a
> > --known-64-bit-blas-indices=1 --with-matlab-engine=1
> > > 
> > > Compilers:
> >
> >
> >
> > >   C Compiler:
> >  /nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/bin/mpicc  -fPIC -Wall
> > -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector
> > -fvisibility=hidden -O3
> > > Version: gcc (Ubuntu 9.4.0-1ubuntu1~20.04.2) 9.4.0
> > >   C+

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-02 Thread Satish Balay via petsc-users
Perhaps you can try the following to get additional info - and debug

Satish
--

balay@compute-386-07:/scratch/balay/petsc$ cd 
src/tao/leastsquares/tutorials/matlab/
balay@compute-386-07:/scratch/balay/petsc/src/tao/leastsquares/tutorials/matlab$
 make matlab_ls_test
/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/bin/mpicc -fPIC -Wall 
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector 
-fvisibility=hidden -O3 -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O3  
-I/scratch/balay/petsc/include -I/scratch/balay/petsc/linux-opt/include 
-I/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/extern/include
 -I/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/include  
matlab_ls_test.c  -Wl,-rpath,/scratch/balay/petsc/linux-opt/lib 
-L/scratch/balay/petsc/linux-opt/lib 
-Wl,-rpath,/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/sys/os/glnxa64:/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/glnxa64:/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/extern/lib/glnxa64
 -L/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/glnxa64 
-L/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/extern/lib/glnxa64 
-Wl,-rpath,/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/lib 
-L/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 -L/usr/lib/gcc/x86_64-linux-gnu/9 
-Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu 
-Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lpetsc -llapack 
-lblas -lpthread -lm -leng -lmex -lmx -lmat -lut -licudata -licui18n -licuuc 
-lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath 
-lstdc++ -ldl -o matlab_ls_test
balay@compute-386-07:/scratch/balay/petsc/src/tao/leastsquares/tutorials/matlab$
 LD_PRELOAD=/lib/x86_64-linux-gnu/libgfortran.so.5 ./matlab_ls_test  
-tao_smonitor -prob_id 5 -info
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS 
compute-386-07.(none)
[0] PetscInitialize(): Running on machine: compute-386-07
Running problem 5
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max 
tags = 268435455
[0] PetscMatlabEngineCreate(): Starting MATLAB engine with command 
/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab 
-glnxa64 -nodisplay  -nosplash 
[0] PetscMatlabEngineCreate(): Started MATLAB engine
[0] PetscMatlabEngineEvaluate(): Evaluating MATLAB string: TestingInitialize
[0] PetscMatlabEngineEvaluate(): Done evaluating Matlab string: 
TestingInitialize
5


<<<< Now verify if the above matlab command actually works for you - on this 
machine..>>>>>

balay@compute-386-07:/scratch/balay/petsc/src/tao/leastsquares/tutorials/matlab$
 /nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab 
-glnxa64 -nodisplay  -nosplash


   < M A T L A B (R) >

 Copyright 1984-2021 The MathWorks, Inc.

 R2021a (9.10.0.1602886) 64-bit (glnxa64)

February 17, 2021

 
To get started, type doc.
For product information, visit www.mathworks.com.
 
>> 


On Sat, 2 Sep 2023, Satish Balay via petsc-users wrote:

> Please don't cc: both petsc-users and petsc-maint [reverting thread to 
> petsc-users only]
> 
> I'm not sure what is happening here. Can you send the corresponding 
> configure.log, make.log [compressed]?
> 
> Here is my attempt to reproduce (with petsc-3.13) on Ubuntu-20.04, with 
> Matlab-R2021a - and that works:
> 
> balay@compute-386-07:/scratch/balay/petsc$ git branch
>   main
>   release
> * release-3.13
> balay@compute-386-07:/scratch/balay/petsc$ module load matlab/R2021a
> balay@compute-386-07:/scratch/balay/petsc$ which matlab
> /nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab
> balay@compute-386-07:/scratch/balay/petsc$ ./configure 
> --with-mpi-dir=/nfs/gce/projects/petsc/soft/u20.04/mpich-4.0.2/ COPTFLAGS=-O3 
> CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 PETSC_ARCH=linux-opt --with-debugging=0 
> --with-x=0 
> --with-matlab-dir=/nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a
>  
> --with-blaslapack-dir=/nfs/gce/software/custom/l

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-02 Thread Satish Balay via petsc-users
0]PETSC ERROR: End of Error Message ---send entire
> error message to petsc-ma...@mcs.anl.gov--
> # --
> # MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
> # with errorcode 126076.
> #
> # NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> # You may or may not see output from other processes, depending on
> # exactly when Open MPI kills them.
> # --
> 
> I request you to help me to sort out this error.
> 
> Thanks
> Srinivas
> 
> 
> On Fri, Sep 1, 2023 at 11:42 PM Amneet Bhalla  wrote:
> 
> > I think there should be two hyphens in
> > -with-blaslapack-dir=/usr/local/MATLAB/R2020b
> >
> > i.e.:
> >
> > --with-blaslapack-dir=/usr/local/MATLAB/R2020b
> >
> > On Fri, Sep 1, 2023 at 10:25 AM INTURU SRINIVAS 20PHD0548 via petsc-users <
> > petsc-users@mcs.anl.gov> wrote:
> >
> >> Thank you Sathish.I will try this
> >>
> >> On Fri, Sep 1, 2023, 22:53 Satish Balay  wrote:
> >>
> >>> yes [and remove fblaslapack. don't know if hypre will work here].
> >>>
> >>> i.e:
> >>>
> >>> ./configure --with-mpi-dir=/home/vit/sfw/linux/openmpi/4.1.4
> >>> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 PETSC_ARCH=linux-opt
> >>> --with-debugging=0 --with-x=0 \
> >>> --with-matlab-dir=/usr/local/MATLAB/R2020b --with-matlab-engine=1
> >>> -with-blaslapack-dir=/usr/local/MATLAB/R2020b 
> >>> --known-64-bit-blas-indices=1
> >>>
> >>> Satish
> >>>
> >>> On Fri, 1 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> >>>
> >>> > Hi Satish,
> >>> >
> >>> > -with-blaslapack-dir=/path/to/matlab_dir
> >>> > --known-64-bit-blas-indices=1
> >>> >
> >>> > Is this what you are suggesting?
> >>> >
> >>> > On Fri, Sep 1, 2023, 20:42 Satish Balay  wrote:
> >>> >
> >>> > > Also:
> >>> > >
> >>> > > '-known-64-bit-blas-indices=1',
> >>> > >
> >>> > > Note: most externalpackages might not work in this mode.
> >>> > >
> >>> > > [we can't really over come such dependency/conflicts across packages]
> >>> > >
> >>> > > Satish
> >>> > >
> >>> > > On Fri, 1 Sep 2023, Satish Balay via petsc-users wrote:
> >>> > >
> >>> > > > Here is the matlab test that runs in CI
> >>> > > >
> >>> > > > https://gitlab.com/petsc/petsc/-/jobs/4904566768
> >>> > > >
> >>> > > > config/examples/arch-ci-linux-matlab-ilp64.py
> >>> > > >
> >>> > > > # Note: regular BLAS [with 32-bit integers] conflict with
> >>> > > > # MATLAB BLAS - hence requiring -known-64-bit-blas-indices=1
> >>> > > >
> >>> > > > Ah - so you need to use the ilp64 blas/lapack with matlab  - to
> >>> have a
> >>> > > compatible build
> >>> > > >
> >>> > > > '--with-blaslapack-dir='+matlab_dir,
> >>> > > >
> >>> > > > Satish
> >>> > > >
> >>> > > >
> >>> > > > On Fri, 1 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users
> >>> wrote:
> >>> > > >
> >>> > > > > Hi Amneet,
> >>> > > > >
> >>> > > > > Without libmesh, even for PETSc with MATLAB is not working. It is
> >>> > > showing
> >>> > > > > error for both 3.13.4 and 3.17.5 versions.
> >>> > > > >
> >>> > > > > I am trying to install IBAMR in HPC cluster with libmesh and
> >>> this is
> >>> > > for
> >>> > > > > general usage not only for WEC. This I tried without linking
> >>> Matlab
> >>> > > with
> >>> > > > > PETSc. Here also I got the error while configuring libmesh 1.6.2
> >>> that
> >>> > > PETSc
> >>> > > > > was not found but --enable-petsc-required is specified.
> >>> > > > >
> >>> > > > > The reason for specifically ask

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-01 Thread Satish Balay via petsc-users
yes [and remove fblaslapack. don't know if hypre will work here].

i.e:

./configure --with-mpi-dir=/home/vit/sfw/linux/openmpi/4.1.4 COPTFLAGS=-O3 
CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 PETSC_ARCH=linux-opt --with-debugging=0 
--with-x=0 \
--with-matlab-dir=/usr/local/MATLAB/R2020b --with-matlab-engine=1 
-with-blaslapack-dir=/usr/local/MATLAB/R2020b --known-64-bit-blas-indices=1

Satish

On Fri, 1 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:

> Hi Satish,
> 
> -with-blaslapack-dir=/path/to/matlab_dir
> --known-64-bit-blas-indices=1
> 
> Is this what you are suggesting?
> 
> On Fri, Sep 1, 2023, 20:42 Satish Balay  wrote:
> 
> > Also:
> >
> > '-known-64-bit-blas-indices=1',
> >
> > Note: most externalpackages might not work in this mode.
> >
> > [we can't really over come such dependency/conflicts across packages]
> >
> > Satish
> >
> > On Fri, 1 Sep 2023, Satish Balay via petsc-users wrote:
> >
> > > Here is the matlab test that runs in CI
> > >
> > > https://gitlab.com/petsc/petsc/-/jobs/4904566768
> > >
> > > config/examples/arch-ci-linux-matlab-ilp64.py
> > >
> > > # Note: regular BLAS [with 32-bit integers] conflict with
> > > # MATLAB BLAS - hence requiring -known-64-bit-blas-indices=1
> > >
> > > Ah - so you need to use the ilp64 blas/lapack with matlab  - to have a
> > compatible build
> > >
> > > '--with-blaslapack-dir='+matlab_dir,
> > >
> > > Satish
> > >
> > >
> > > On Fri, 1 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> > >
> > > > Hi Amneet,
> > > >
> > > > Without libmesh, even for PETSc with MATLAB is not working. It is
> > showing
> > > > error for both 3.13.4 and 3.17.5 versions.
> > > >
> > > > I am trying to install IBAMR in HPC cluster with libmesh and this is
> > for
> > > > general usage not only for WEC. This I tried without linking Matlab
> > with
> > > > PETSc. Here also I got the error while configuring libmesh 1.6.2 that
> > PETSc
> > > > was not found but --enable-petsc-required is specified.
> > > >
> > > > The reason for specifically asking for libmesh with PETSc and Matlab
> > is in
> > > > the cfd-mpc-wecs application for complicated geometries it is
> > mentioned to
> > > > build cfd-mpc-wecs with libmesh construct.
> > > >
> > > > For the past 10 days I am trying these various things and ended up with
> > > > errors.
> > > > I want to solve this and proceed further to do my research work.
> > > >
> > > > Please help me to solve this as soon as possible.
> > > >
> > > >
> > > > Thanks and Regards
> > > > Srinivas
> > > >
> > > > On Fri, Sep 1, 2023, 19:48 Amneet Bhalla 
> > wrote:
> > > >
> > > > > Hi Srinivas,
> > > > >
> > > > > As discussed earlier you don’t need libMesh for the WEC application.
> > You
> > > > > can just work with PETSc 3.17.5 that builds with Matlab.
> > > > >
> > > > > Do you have a specific reason for wanting to build libMesh?
> > > > >
> > > > > Thanks,
> > > > > —Amneet
> > > > >
> > > > > On Thu, Aug 31, 2023 at 10:16 PM INTURU SRINIVAS 20PHD0548 via
> > petsc-users
> > > > >  wrote:
> > > > >
> > > > >> Hi,
> > > > >>
> > > > >> I ran "make all" for petsc 3.13.4 by removing all occurrences of
> > "-lut
> > > > >> -licudata -licui18n -licuuc". When I ran "make test" got the
> > following
> > > > >> errors
> > > > >> in petsc/3.13.4/linux-debug:
> > > > >> TEST
> > > > >>
> > linux-debug/tests/counts/tao_leastsquares_tutorials_matlab-matlab_ls_test.counts
> > > > >> not ok tao_leastsquares_tutorials_matlab-matlab_ls_test # Error
> > code: 124
> > > > >> # Running problem 5
> > > > >> # [0]PETSC ERROR: - Error Message
> > > > >> --
> > > > >> # [0]PETSC ERROR: Error in external library
> > > > >> # [0]PETSC ERROR: Unable to start MATLAB engine on
> > > > >> # [0]PETSC ERROR: See
> > > > >>

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-01 Thread Satish Balay via petsc-users
Also:

'-known-64-bit-blas-indices=1',

Note: most externalpackages might not work in this mode.

[we can't really over come such dependency/conflicts across packages]

Satish

On Fri, 1 Sep 2023, Satish Balay via petsc-users wrote:

> Here is the matlab test that runs in CI 
> 
> https://gitlab.com/petsc/petsc/-/jobs/4904566768
> 
> config/examples/arch-ci-linux-matlab-ilp64.py
> 
> # Note: regular BLAS [with 32-bit integers] conflict with
> # MATLAB BLAS - hence requiring -known-64-bit-blas-indices=1
> 
> Ah - so you need to use the ilp64 blas/lapack with matlab  - to have a 
> compatible build
> 
> '--with-blaslapack-dir='+matlab_dir,
> 
> Satish
> 
> 
> On Fri, 1 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> 
> > Hi Amneet,
> > 
> > Without libmesh, even for PETSc with MATLAB is not working. It is showing
> > error for both 3.13.4 and 3.17.5 versions.
> > 
> > I am trying to install IBAMR in HPC cluster with libmesh and this is for
> > general usage not only for WEC. This I tried without linking Matlab with
> > PETSc. Here also I got the error while configuring libmesh 1.6.2 that PETSc
> > was not found but --enable-petsc-required is specified.
> > 
> > The reason for specifically asking for libmesh with PETSc and Matlab is in
> > the cfd-mpc-wecs application for complicated geometries it is mentioned to
> > build cfd-mpc-wecs with libmesh construct.
> > 
> > For the past 10 days I am trying these various things and ended up with
> > errors.
> > I want to solve this and proceed further to do my research work.
> > 
> > Please help me to solve this as soon as possible.
> > 
> > 
> > Thanks and Regards
> > Srinivas
> > 
> > On Fri, Sep 1, 2023, 19:48 Amneet Bhalla  wrote:
> > 
> > > Hi Srinivas,
> > >
> > > As discussed earlier you don’t need libMesh for the WEC application. You
> > > can just work with PETSc 3.17.5 that builds with Matlab.
> > >
> > > Do you have a specific reason for wanting to build libMesh?
> > >
> > > Thanks,
> > > —Amneet
> > >
> > > On Thu, Aug 31, 2023 at 10:16 PM INTURU SRINIVAS 20PHD0548 via petsc-users
> > >  wrote:
> > >
> > >> Hi,
> > >>
> > >> I ran "make all" for petsc 3.13.4 by removing all occurrences of "-lut
> > >> -licudata -licui18n -licuuc". When I ran "make test" got the following
> > >> errors
> > >> in petsc/3.13.4/linux-debug:
> > >> TEST
> > >> linux-debug/tests/counts/tao_leastsquares_tutorials_matlab-matlab_ls_test.counts
> > >> not ok tao_leastsquares_tutorials_matlab-matlab_ls_test # Error code: 124
> > >> # Running problem 5
> > >> # [0]PETSC ERROR: - Error Message
> > >> --
> > >> # [0]PETSC ERROR: Error in external library
> > >> # [0]PETSC ERROR: Unable to start MATLAB engine on
> > >> # [0]PETSC ERROR: See
> > >> https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> > >> shooting.
> > >> # [0]PETSC ERROR: Petsc Release Version 3.13.4, Aug 01, 2020
> > >> # [0]PETSC ERROR: ../matlab_ls_test on a linux-debug named MB108SMEC028
> > >> by vit Fri Sep  1 10:01:11 2023
> > >> # [0]PETSC ERROR: Configure options
> > >> --CC=/home/vit/sfw/linux/openmpi/4.1.4/bin/mpicc
> > >> --CXX=/home/vit/sfw/linux/openmpi/4.1.4/bin/mpicxx
> > >> --FC=/home/vit/sfw/linux/openmpi/4.1.4/bin/mpif90 --with-debugging=1
> > >> --download-hypre=1 --download-fblaslapack=1 --with-x=0
> > >> --with-matlab-dir=/usr/local/MATLAB/R2020b/ --with-matlab-engine=1
> > >> --with-matlab-engine-dir=/usr/local/MATLAB/R2020b/extern/engines/
> > >> # [0]PETSC ERROR: #1 PetscMatlabEngineCreate() line 67 in
> > >> /home/vit/sfw/petsc/3.13.4/src/sys/classes/matlabengine/matlab.c
> > >> # [0]PETSC ERROR: #2 main() line 126 in
> > >> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c
> > >> # [0]PETSC ERROR: PETSc Option Table entries:
> > >> # [0]PETSC ERROR: -prob_id 5
> > >> # [0]PETSC ERROR: -tao_smonitor
> > >> # [0]PETSC ERROR: End of Error Message ---send entire
> > >> error message to petsc-ma...@mcs.anl.gov--
> > >> #
> > >> ---

Re: [petsc-users] Error while building PETSc with MATLAB

2023-09-01 Thread Satish Balay via petsc-users
Here is the matlab test that runs in CI 

https://gitlab.com/petsc/petsc/-/jobs/4904566768

config/examples/arch-ci-linux-matlab-ilp64.py

# Note: regular BLAS [with 32-bit integers] conflict with
# MATLAB BLAS - hence requiring -known-64-bit-blas-indices=1

Ah - so you need to use the ilp64 blas/lapack with matlab  - to have a 
compatible build

'--with-blaslapack-dir='+matlab_dir,

Satish


On Fri, 1 Sep 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:

> Hi Amneet,
> 
> Without libmesh, even for PETSc with MATLAB is not working. It is showing
> error for both 3.13.4 and 3.17.5 versions.
> 
> I am trying to install IBAMR in HPC cluster with libmesh and this is for
> general usage not only for WEC. This I tried without linking Matlab with
> PETSc. Here also I got the error while configuring libmesh 1.6.2 that PETSc
> was not found but --enable-petsc-required is specified.
> 
> The reason for specifically asking for libmesh with PETSc and Matlab is in
> the cfd-mpc-wecs application for complicated geometries it is mentioned to
> build cfd-mpc-wecs with libmesh construct.
> 
> For the past 10 days I am trying these various things and ended up with
> errors.
> I want to solve this and proceed further to do my research work.
> 
> Please help me to solve this as soon as possible.
> 
> 
> Thanks and Regards
> Srinivas
> 
> On Fri, Sep 1, 2023, 19:48 Amneet Bhalla  wrote:
> 
> > Hi Srinivas,
> >
> > As discussed earlier you don’t need libMesh for the WEC application. You
> > can just work with PETSc 3.17.5 that builds with Matlab.
> >
> > Do you have a specific reason for wanting to build libMesh?
> >
> > Thanks,
> > —Amneet
> >
> > On Thu, Aug 31, 2023 at 10:16 PM INTURU SRINIVAS 20PHD0548 via petsc-users
> >  wrote:
> >
> >> Hi,
> >>
> >> I ran "make all" for petsc 3.13.4 by removing all occurrences of "-lut
> >> -licudata -licui18n -licuuc". When I ran "make test" got the following
> >> errors
> >> in petsc/3.13.4/linux-debug:
> >> TEST
> >> linux-debug/tests/counts/tao_leastsquares_tutorials_matlab-matlab_ls_test.counts
> >> not ok tao_leastsquares_tutorials_matlab-matlab_ls_test # Error code: 124
> >> # Running problem 5
> >> # [0]PETSC ERROR: - Error Message
> >> --
> >> # [0]PETSC ERROR: Error in external library
> >> # [0]PETSC ERROR: Unable to start MATLAB engine on
> >> # [0]PETSC ERROR: See
> >> https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> >> shooting.
> >> # [0]PETSC ERROR: Petsc Release Version 3.13.4, Aug 01, 2020
> >> # [0]PETSC ERROR: ../matlab_ls_test on a linux-debug named MB108SMEC028
> >> by vit Fri Sep  1 10:01:11 2023
> >> # [0]PETSC ERROR: Configure options
> >> --CC=/home/vit/sfw/linux/openmpi/4.1.4/bin/mpicc
> >> --CXX=/home/vit/sfw/linux/openmpi/4.1.4/bin/mpicxx
> >> --FC=/home/vit/sfw/linux/openmpi/4.1.4/bin/mpif90 --with-debugging=1
> >> --download-hypre=1 --download-fblaslapack=1 --with-x=0
> >> --with-matlab-dir=/usr/local/MATLAB/R2020b/ --with-matlab-engine=1
> >> --with-matlab-engine-dir=/usr/local/MATLAB/R2020b/extern/engines/
> >> # [0]PETSC ERROR: #1 PetscMatlabEngineCreate() line 67 in
> >> /home/vit/sfw/petsc/3.13.4/src/sys/classes/matlabengine/matlab.c
> >> # [0]PETSC ERROR: #2 main() line 126 in
> >> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c
> >> # [0]PETSC ERROR: PETSc Option Table entries:
> >> # [0]PETSC ERROR: -prob_id 5
> >> # [0]PETSC ERROR: -tao_smonitor
> >> # [0]PETSC ERROR: End of Error Message ---send entire
> >> error message to petsc-ma...@mcs.anl.gov--
> >> #
> >> --
> >> # MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
> >> # with errorcode 126076.
> >> #
> >> # NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> >> # You may or may not see output from other processes, depending on
> >> # exactly when Open MPI kills them.
> >> #
> >> --
> >>  ok tao_leastsquares_tutorials_matlab-matlab_ls_test # SKIP Command
> >> failed so no diff
> >>
> >> in petsc/3.13.4/linux-opt
> >>  TEST
> >> linux-opt/tests/counts/tao_leastsquares_tutorials_matlab-matlab_ls_test.counts
> >> not ok tao_leastsquares_tutorials_matlab-matlab_ls_test # Error code: 124
> >> # Running problem 5
> >> # [0]PETSC ERROR: - Error Message
> >> --
> >> # [0]PETSC ERROR: Error in external library
> >> # [0]PETSC ERROR: Unable to start MATLAB engine on
> >> # [0]PETSC ERROR: See
> >> https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> >> shooting.
> >> # [0]PETSC ERROR: Petsc Release Version 3.13.4, Aug 01, 2020
> >> # [0]PETSC ERROR: ../matlab_ls_test on a linux-opt named MB108SMEC028 by
> >> vit Fri Sep  1 10:34:12 2023
> >> # [0]PETSC ERROR: Configure 

Re: [petsc-users] Configuring PETSc on Mac M1

2023-09-01 Thread Satish Balay via petsc-users
please send the correspond configure.log for this failure - perhaps to 
petsc-maint [to avoid sending large files to petsc-users mailing list]

BTW: We normally use xcode clang/clang++ with brew gfortran (with system 
blas/lapack) for MacOS builds
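[Editor's note: a minimal sketch of that recommended toolchain mix, not taken
from this thread - the options are illustrative, and the gfortran path assumes
a default Homebrew install.]

```shell
# Hedged sketch: Xcode clang/clang++ plus Homebrew gfortran; configure
# picks up the system (Accelerate) BLAS/LAPACK by default on MacOS.
./configure \
  --with-cc=clang \
  --with-cxx=clang++ \
  --with-fc="$(brew --prefix)/bin/gfortran" \
  --with-debugging=0 --with-x=0
```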

Satish

On Fri, 1 Sep 2023, Giselle Sosa Jones wrote:

> Hello,
> 
> I am trying to install PETSc on my M1 Mac and I keep encountering the
> following error when configuring:
> 
> Unknown Fortran name mangling: Are you sure the C and Fortran compilers are
> compatible?
> 
>   Perhaps one is 64 bit and one is 32 bit?
> 
> 
> Any ideas on how to fix this? All my compilers are 64 bit. I am using
> gfortran and gcc (installed with brew).
> 
> Thank you in advance.
> 



Re: [petsc-users] Error while building PETSc with MATLAB

2023-08-29 Thread Satish Balay via petsc-users
Well - you sent in libmesh log not petsc's configure.log/make.log for petsc-3.17

Anyway - with petsc-3.13 - you have:



>>>
Matlab:
  Includes: -I/usr/local/MATLAB/R2020b/extern/include
  /usr/local/MATLAB/R2020b
MatlabEngine:
  Library:  
-Wl,-rpath,/usr/local/MATLAB/R2020b/sys/os/glnxa64:/usr/local/MATLAB/R2020b/bin/glnxa64:/usr/local/MATLAB/R2020b/extern/lib/glnxa64
 -L/usr/local/MATLAB/R2020b/bin/glnxa64 
-L/usr/local/MATLAB/R2020b/extern/lib/glnxa64 -leng -lmex -lmx -lmat -lut 
-licudata -licui18n -licuuc
  Language used to compile PETSc: C
<<<

With petsc-3.19 (and matlab-R2022a) - we are seeing:

https://gitlab.com/petsc/petsc/-/jobs/4904566768

>>>
Matlab:
  Includes:   
-I/nfs/gce/software/custom/linux-ubuntu22.04-x86_64/matlab/R2022a/extern/include
  Libraries:  
-Wl,-rpath,/nfs/gce/software/custom/linux-ubuntu22.04-x86_64/matlab/R2022a/bin/glnxa64
 -L/nfs/gce/software/custom/linux-ubuntu22.04-x86_64/matlab/R2022a/bin/glnxa64 
-leng -lmex -lmx -lmat
  Executable: /nfs/gce/software/custom/linux-ubuntu22.04-x86_64/matlab/R2022a
  mex: /nfs/gce/software/custom/linux-ubuntu22.04-x86_64/matlab/R2022a/bin/mex
  matlab: 
/nfs/gce/software/custom/linux-ubuntu22.04-x86_64/matlab/R2022a/bin/matlab 
-glnxa64
<<<

I.e "-lut -licudata -licui18n -licuuc" are not preset here. This might be a 
change wrt newer matlab versions.

You can:

- edit /home/vit/sfw/petsc/3.13.4/linux-opt/lib/petsc/conf/petscvariables and 
remove all occurrences of "-lut -licudata -licui18n -licuuc"
- now run 'make all' in '/home/vit/sfw/petsc/3.13.4'

And see if the build works now.
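[Editor's note: the two manual steps above can be scripted. A hedged sketch -
the default paths are the ones from this thread; adjust PETSC_DIR/PETSC_ARCH
for your own tree, and keep the backup in case the edit misfires.]

```shell
# Strip every occurrence of the ICU/ut libraries from the generated
# petscvariables, then rebuild.
VARS="${PETSC_DIR:-$HOME/sfw/petsc/3.13.4}/${PETSC_ARCH:-linux-opt}/lib/petsc/conf/petscvariables"
if [ -f "$VARS" ]; then
  cp "$VARS" "$VARS.bak"                                  # backup first
  sed -i 's/-lut -licudata -licui18n -licuuc//g' "$VARS"  # remove the libs
  # then rebuild: make -C "${PETSC_DIR:-$HOME/sfw/petsc/3.13.4}" all
fi
```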

Satish

On Tue, 29 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:

> I am sharing the log files while building petsc3.13.4 with matlab and also
> the log file while building libmesh with petsc3.17.5 and matlab. Building
> petsc 3.17.5 with matlab was done successfully. But libmesh is not able to
> find the petsc
> Please find the attachments.
> 
> On Tue, Aug 29, 2023 at 7:31 PM Satish Balay  wrote:
> 
> > Send configure.log, make.log from both petsc-3.13 and 3.17 [or 3.19].
> >
> > [you can gzip them to make the logs friendly to mailing list - or send
> > them to petsc-maint]
> >
> > And does test suite work with 3.17? [or 3.19?]
> >
> > Satish
> >
> > On Tue, 29 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> >
> > > I am sharing the make.log file while building petsc-3.13.4 with Matlab.
> > > Please find the attachment and do the needful.
> > >
> > > On Tue, Aug 29, 2023 at 10:19 AM INTURU SRINIVAS 20PHD0548 <
> > > inturu.srinivas2...@vitstudent.ac.in> wrote:
> > >
> > > > I tried with petsc-3.17.5. During building of libmesh, the error shows
> > > > petsc was not found
> > > >
> > > > On Mon, Aug 28, 2023 at 9:43 PM Satish Balay 
> > wrote:
> > > >
> > > >> https://ibamr.github.io/linux says petsc-3.17
> > > >>
> > > >> Here you are using 3.13
> > > >>
> > > >> Can you retry with petsc-3.17.5?
> > > >>
> > > >> Satish
> > > >>
> > > >> On Mon, 28 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> > > >>
> > > >> > Hello,
> > > >> >
> > > >> > I want to build PETSc with MATLAB for working on the simulation
> > using
> > > >> IBAMR
> > > >> > open software. While building the PETSc, using the following
> > > >> >
> > > >> > export PETSC_DIR=$PWD
> > > >> > export PETSC_ARCH=linux-debug
> > > >> > ./configure \
> > > >> >   --CC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicc \
> > > >> >   --CXX=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicxx \
> > > >> >   --FC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpif90 \
> > > >> >   --with-debugging=1 \
> > > >> >   --download-hypre=1 \
> > > >> >   --download-fblaslapack=1 \
> > > >> >   --with-x=0 \
> > > >> >   --with-matlab-dir=/usr/local/MATLAB/R2020b/
> > > >> >   --with-matlab-engine=1
> > > >> >   --with-matlab-engine-dir=/usr/local/MATLAB/R2020b/extern/engines/
> > > >> >
> > > >> > make -j4
> > > >> > make -j4 test
> > > >> >
> > > >> > I got the following error
> > > >> > CLINKER
> > > >> linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test
> > > >> > /usr/bin/ld:
> > > >> >
> > linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test.o: in
> > > >> > function `EvaluateResidual':
> > > >> >
> > > >>
> > /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:32:
> > > >> > undefined reference to `PetscMatlabEnginePut'
> > > >> > /usr/bin/ld:
> > > >> >
> > > >>
> > /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:33:
> > > >> > undefined reference to `PetscMatlabEngineEvaluate'
> > > >> > /usr/bin/ld:
> > > >> >
> > > >>
> > /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:35:
> > > >> > undefined reference to `PetscMatlabEngineGet'
> > > >> > /usr/bin/ld:
> > > >> >
> > linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test.o: in
> > > >> > function `EvaluateJacobian':
> > > >> >
> > > >>
> > 

Re: [petsc-users] Error while building PETSc with MATLAB

2023-08-29 Thread Satish Balay via petsc-users
Send configure.log, make.log from both petsc-3.13 and 3.17 [or 3.19].

[you can gzip them to make the logs friendly to mailing list - or send them to 
petsc-maint]

And does test suite work with 3.17? [or 3.19?]

Satish

On Tue, 29 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:

> I am sharing the make.log file while building petsc-3.13.4 with Matlab.
> Please find the attachment and do the needful.
> 
> On Tue, Aug 29, 2023 at 10:19 AM INTURU SRINIVAS 20PHD0548 <
> inturu.srinivas2...@vitstudent.ac.in> wrote:
> 
> > I tried with petsc-3.17.5. During building of libmesh, the error shows
> > petsc was not found
> >

Re: [petsc-users] Error while building PETSc with MATLAB

2023-08-28 Thread Satish Balay via petsc-users
Also - the instructions don't say if matlab is required.

So perhaps you might want to try an install without matlab - and see if you are 
able to get IBAMR working.

Satish
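
BTW: one hedged observation on the configure invocation quoted below - the
option lines after "--with-x=0 \" are missing trailing backslashes, so
"--with-matlab-engine=1" and "--with-matlab-engine-dir=..." were likely never
passed to configure at all [the shell would have run them as separate
commands]. That would leave the MATLAB engine support disabled - which is
consistent with the undefined "PetscMatlabEngine*" references at link time.
A corrected sketch [same paths as in your mail]:

```shell
./configure \
  --CC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicc \
  --CXX=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicxx \
  --FC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpif90 \
  --with-debugging=1 \
  --download-hypre=1 \
  --download-fblaslapack=1 \
  --with-x=0 \
  --with-matlab-dir=/usr/local/MATLAB/R2020b/ \
  --with-matlab-engine=1 \
  --with-matlab-engine-dir=/usr/local/MATLAB/R2020b/extern/engines/
```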

On Mon, 28 Aug 2023, Satish Balay via petsc-users wrote:

> https://ibamr.github.io/linux says petsc-3.17
> 
> Here you are using 3.13
> 
> Can you retry with petsc-3.17.5?
> 
> Satish
> 
> On Mon, 28 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> 
> > Hello,
> > 
> > I want to build PETSc with MATLAB for working on the simulation using IBAMR
> > open software. While building the PETSc, using the following
> > 
> > export PETSC_DIR=$PWD
> > export PETSC_ARCH=linux-debug
> > ./configure \
> >   --CC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicc \
> >   --CXX=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicxx \
> >   --FC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpif90 \
> >   --with-debugging=1 \
> >   --download-hypre=1 \
> >   --download-fblaslapack=1 \
> >   --with-x=0 \
> >   --with-matlab-dir=/usr/local/MATLAB/R2020b/
> >   --with-matlab-engine=1
> >   --with-matlab-engine-dir=/usr/local/MATLAB/R2020b/extern/engines/
> > 
> > make -j4
> > make -j4 test
> > 
> 



Re: [petsc-users] Error while building PETSc with MATLAB

2023-08-28 Thread Satish Balay via petsc-users
https://ibamr.github.io/linux says petsc-3.17

Here you are using 3.13

Can you retry with petsc-3.17.5?

Satish

On Mon, 28 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:

> Hello,
> 
> I want to build PETSc with MATLAB for working on the simulation using IBAMR
> open software. While building the PETSc, using the following
> 
> export PETSC_DIR=$PWD
> export PETSC_ARCH=linux-debug
> ./configure \
>   --CC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicc \
>   --CXX=$HOME/sfw/linux/openmpi/4.1.4/bin/mpicxx \
>   --FC=$HOME/sfw/linux/openmpi/4.1.4/bin/mpif90 \
>   --with-debugging=1 \
>   --download-hypre=1 \
>   --download-fblaslapack=1 \
>   --with-x=0 \
>   --with-matlab-dir=/usr/local/MATLAB/R2020b/
>   --with-matlab-engine=1
>   --with-matlab-engine-dir=/usr/local/MATLAB/R2020b/extern/engines/
> 
> make -j4
> make -j4 test
> 
> I got the following error
> CLINKER linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test
> /usr/bin/ld:
> linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test.o: in
> function `EvaluateResidual':
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:32:
> undefined reference to `PetscMatlabEnginePut'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:33:
> undefined reference to `PetscMatlabEngineEvaluate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:35:
> undefined reference to `PetscMatlabEngineGet'
> /usr/bin/ld:
> linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test.o: in
> function `EvaluateJacobian':
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:46:
> undefined reference to `PetscMatlabEnginePut'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:47:
> undefined reference to `PetscMatlabEngineEvaluate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:49:
> undefined reference to `PetscMatlabEngineGet'
> /usr/bin/ld:
> linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test.o: in
> function `TaoPounders':
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:75:
> undefined reference to `PetscMatlabEngineGet'
> /usr/bin/ld:
> linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test.o: in
> function `main':
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:126:
> undefined reference to `PetscMatlabEngineCreate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:127:
> undefined reference to `PetscMatlabEngineEvaluate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:139:
> undefined reference to `PetscMatlabEngineEvaluate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:140:
> undefined reference to `PetscMatlabEngineGetArray'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:142:
> undefined reference to `PetscMatlabEngineGetArray'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:144:
> undefined reference to `PetscMatlabEngineGetArray'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:146:
> undefined reference to `PetscMatlabEngineGetArray'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:148:
> undefined reference to `PetscMatlabEngineGetArray'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:154:
> undefined reference to `PetscMatlabEngineEvaluate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:157:
> undefined reference to `PetscMatlabEngineEvaluate'
> /usr/bin/ld:
> /home/vit/sfw/petsc/3.13.4/src/tao/leastsquares/tutorials/matlab/matlab_ls_test.c:158:
> undefined reference to `PetscMatlabEngineDestroy'
> 
> collect2: error: ld returned 1 exit status
> make: *** [gmakefile.test:185:
> linux-debug/tests/tao/leastsquares/tutorials/matlab/matlab_ls_test] Error 1
> make: *** Waiting for unfinished jobs
> 
> Please help me to solve this issue
> 
> Thank you
> Srinivas
> 
> 



Re: [petsc-users] [petsc-maint] REQUESTING INVITATON FOR SLACK WORKSPACE OF PETSC

2023-08-23 Thread Satish Balay via petsc-users
Check: https://lists.mcs.anl.gov/pipermail/petsc-users/2023-July/049115.html

Also - best to not cross post to multiple lists.

Satish

On Wed, 23 Aug 2023, VAIBHAV BHANDARI wrote:

> Dear Sir/Mam,
> 
> I hope this email finds you well. I am writing to request an invitation to
> join the PETSc Slack Workspace. As a passionate enthusiast of parallel and
> scientific computing, I have been following the development and
> advancements of PETSc with great interest.
> I have been actively involved in *Topology Optimization* and believe that
> being a part of the PETSc Slack community would provide me with an avenue
> to share insights, seek advice, and learn from the experiences of fellow
> members.
> If possible, I kindly request an invitation to join the PETSc Slack
> Workspace. I assure you that I will adhere to all community guidelines and
> contribute positively to the discussions. I am excited about the prospect
> of connecting with experts and enthusiasts in the field and contributing to
> the mutual growth and understanding within the PETSc community.
> 
> Thank you for considering my request. I eagerly await the opportunity to
> engage with the PETSc community on Slack. Please feel free to reach out to
> me at [vaibha...@ce.iitr.ac.in] if you require any further information.
> 
> Looking forward to your positive response.
> 
> Best regards,
> Vaibhav Bhandari
> Ph.D. STudent
> IIT Roorkee
> 



Re: [petsc-users] Some questions about the directory structure

2023-08-21 Thread Satish Balay via petsc-users
On Mon, 21 Aug 2023, meator wrote:

> Hi. I'm trying to package PETSc using the tarball with documentation
> (https://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-with-docs-3.19.4.tar.gz)
> and I've got some questions about the structure of PETSc.
> 
> What are the contents of the /usr/lib/petsc directory in destdir for? This
> directory has two subdirectories: bin and conf. Why is the bin/ directory in
> lib/? lib/ should be for libraries.

balay@p1 /home/balay
$ find /usr/lib -name bin
/usr/lib/debug/bin
/usr/lib/debug/usr/bin
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.372.b07-6.fc38.x86_64/jre/bin
/usr/lib/jvm/java-17-openjdk-17.0.8.0.7-1.fc38.x86_64/bin
/usr/lib/jvm/java-11-openjdk-11.0.20.0.8-1.fc38.x86_64/bin
balay@p1 /home/balay
$ find /usr/lib64 -name bin
/usr/lib64/qt5/bin
/usr/lib64/R/bin


> Are the executables contained in
> /usr/lib/petsc/bin essential to the user or the developer (should this be in a
> -devel subpackage)?

Hmm - different scripts/utils have different purposes. I guess most are useful 
from the devel subpackage.

> Some of the scripts don't have the executable bit
> (/usr/lib/petsc/bin/configureTAS.py, /usr/lib/petsc/bin/extract.py,
> /usr/lib/petsc/bin/petsc_tas_style.mplstyle, /usr/lib/petsc/bin/tasClasses.py,
> /usr/lib/petsc/bin/xml2flamegraph.py). What is their purpose?

I guess some of them can use some cleanup. extract.py likely belongs in 
bin/maint [and should be excluded from the tarball..]

> 
> The /usr/lib/petsc/conf directory seems to be related to the build process. Is
> that correct?

These have makefiles that can be included from user/application makefiles - to 
get compile/link working seamlessly.
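
A minimal sketch of that usage (assuming PETSC_DIR and PETSC_ARCH are set in
the environment; the target name is illustrative):

```shell
# Write a minimal user makefile that pulls in PETSc's variables and
# pattern rules - after this, e.g. "make ex1" compiles/links ex1.c with
# the compilers, flags, and libraries recorded by PETSc's configure.
cat > Makefile <<'EOF'
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
EOF
cat Makefile
```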


> If yes, I will delete the directory from the package because
> packages shouldn't include these things. This directory even includes
> uninstall.py which is undesirable for a packaged program because this is the
> package manager's job.

Sure - some files might not make sense to be included in a packaging system.

> 
> /usr/share/petsc looks like it contains additional info useful to the
> developers, therefore it should be in a -devel subpackage.

> 
> I see that the docs directory contains .buildinfo. Does this directory contain
> additional build artifacts (that should be removed)?

I guess some of these files should be excluded from tarball.

> 
> The main index.html of the documentation (from the tarball linked at the
> beginning of this e-mail) is invalid. It has all the menus but the main part
> of the page is blank. The raw HTML is cut off, there's no content and there
> are unclosed tags.

Hm - since the docs are primarily tested/used at petsc.org - some of that 
functionality probably doesn't work as raw html - and might need fixes.

Satish

> 
> Many of my questions may be trivial but I want to make sure to not break the
> package.
> 
> Thanks in advance
> 
> 



Re: [petsc-users] Configure AMGX

2023-08-18 Thread Satish Balay via petsc-users
Can you try the update in branch "balay/amgx-cuda-12"?

Satish

On Fri, 18 Aug 2023, Zisheng Ye wrote:

> Dear PETSc team
> 
> I am configuring AMGX package under the main branch with CUDA 12.1. But it 
> can't get through. Can you help to solve the problem? I have attached the 
> configure.log to the email.
> 
> Thanks,
> Zisheng
> 



Re: [petsc-users] PetscCall( ) in fortran

2023-08-18 Thread Satish Balay via petsc-users
I think gfortran defaults to fixed form for .F and free-form for .F90

This can be changed with FFLAGS=-ffree-form - but yeah - switching the suffix 
might be more suitable..

In addition - PETSc configure attempts to add in "-ffree-line-length-none 
-ffree-line-length-0" options - so that extra long source lines can be used 
[with the default PETSc makefiles].
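
So with the default PETSc makefiles something like this should work for
free-form sources that keep the .F suffix (sketch - the target name is
illustrative):

```shell
make myapp FFLAGS="-ffree-form -ffree-line-length-none"
```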

Satish


On Thu, 17 Aug 2023, Sanjay Govindjee wrote:

> Thanks.
> 
> For what it is worth in regards to question (1), GNU Fortran (Homebrew GCC
> 11.3.0_2) 11.3.0 seems to need .F90 (as opposed to just .F).
> 
> -sanjay
> 
> 
> On 8/17/23 4:50 PM, Barry Smith wrote:
> >
> >> On Aug 17, 2023, at 7:44 PM, Sanjay Govindjee  wrote:
> >>
> >> Two questions about the PetscCall( ) etc. functionality in fortran:
> >>
> >> (1) To use this functionality, is it required to use a .F90 naming
> >> convention? or should I be able to use .F?
> > This likely depends on the compiler.
> >> (2) Is it permitted to use line continuation within these calls? For
> >> example something like
> >>
> >>PetscCallMPIA(MPI_Allreduce(localval,globalsum,1,&
> >>  MPIU_REAL,MPIU_SUM, PETSC_COMM_WORLD,ierr))
> >>
> >> or is it required to just have an extra long line? or is there an alternate
> >> syntax for continuation in this case?
> > Because PetscCallXXX() is a macro, it "breaks" if a continuation is
> > used, so yes, you will sometimes need long lines. Most Fortran compilers
> > have an option to allow infinitely long lines.
> >
> > Note that you can still use CHKERRQ() either all the time or in
> > situations where you "need" a continuation character.
> >
> >Barry
> >
> >
> >> -sanjay
> >>
> 



Re: [petsc-users] issue with multiple versions of mpi

2023-08-15 Thread Satish Balay via petsc-users
Do you get this error when you compile a PETSc example [with the
corresponding PETSc makefile]?

If not - you'll have to check the difference in compiler options
between this example compile - and your application.
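
i.e. something like (sketch - assumes PETSC_DIR/PETSC_ARCH are set to point
at the complex build):

```shell
cd $PETSC_DIR/src/ksp/ksp/tutorials
make ex2            # built via PETSc's makefile - i.e. the correct MPI wrappers/flags
./ex2 -ksp_monitor  # quick sanity run
```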

Satish

On Tue, 15 Aug 2023, maitri ksh wrote:

> I was earlier using petsc with a real build; I tried configuring it for a
> complex environment using the same procedure (at least I think so) but with
> the addition of '--with-scalar-type=complex'. There appears to be no
> problem during the configuration process and its checks ('*configure.log*'
> is attached). But while compiling code, I ran into an error message
> concerning conflicts due to multiple versions of mpi. How do I resolve
> this?
> 



Re: [petsc-users] 32-bit vs 64-bit GPU support

2023-08-11 Thread Satish Balay via petsc-users
On Fri, 11 Aug 2023, Jed Brown wrote:

> Jacob Faibussowitsch  writes:
> 
> > More generally, it would be interesting to know the breakdown of installed 
> > CUDA versions for users. Unlike compilers etc, I suspect that cluster 
> > admins (and those running on local machines) are much more likely to be 
> > updating their CUDA toolkits to the latest versions as they often contain 
> > critical performance improvements.
> 
> One difference is that some sites (not looking at you at all, ALCF) still run 
> pretty ancient drivers and/or have broken GPU-aware MPI with all but a 
> specific ancient version of CUDA (OLCF, LLNL). With a normal compiler, you 
> can choose to use the latest version, but with CUDA, people are firmly stuck 
> on old versions.
> 

Well Nvidia keeps phasing out support for older GPUs in newer CUDA releases - 
so unless GPUs are upgraded - they can't really upgrade (to latest) CUDA 
versions ..

[this is in addition to the usual reasons admins don't do software upgrades... 
Ignore clusters - our CUDA CI machine has random stability issues - so we had 
to downgrade/freeze cuda/driver versions to keep the machine functional]

Satish



Re: [petsc-users] IMPI with Hypre

2023-08-08 Thread Satish Balay via petsc-users
Sure - but if 'module load icc' has 'gcc-4*' in its path - that's a bug in the icc 
module spec [as that version is incompatible with its c++ support] . It should 
also load a compatible gcc version [via PATH - or via module dependencies]

if its implemented this way - then you won't have a broken icc - that requires 
a swap of gcc.

Satish

On Tue, 8 Aug 2023, Victor Eijkhout wrote:

> You say bug I say feature. Lmod has a way to mark modules as mutually 
> exclusive. That’s a decision of the way the site is set up. For most users 
> that’s a good idea.
> 
> For instance, if you load two compilers, and both have an MPI, how do you 
> decide which one is loaded by “load mpich”?
> 
> Etc. I’m sure thought has gone into this.
> 
> Victor.
> 
> 
> 


Re: [petsc-users] IMPI with Hypre

2023-08-08 Thread Satish Balay via petsc-users
If using modules - using 'module load gcc icc' [or equivalent] should normally 
work - but if the modules are setup such that loading icc unloads gcc - then I 
think that's a bug in this module setup..

[as icc has an (internal) dependency on gcc - so ignoring this dependency to 
remove a gcc module doesn't look correct to me]

Satish

On Tue, 8 Aug 2023, Victor Eijkhout wrote:

>   *   Its easier to just add the newer version of gcc/g++ compilers to PATH
> 
> Except that I do my path loading through environment modules (lmod version) 
> and they do not allow multiple compilers to be loaded at the same time.
> 
> But yes, that would work.
> 
> V.
> 
> From: Satish Balay 
> Date: Tuesday, August 8, 2023 at 11:20
> To: Victor Eijkhout 
> Cc: Barry Smith , Khaled Nabil Shar Abdelaziz 
> , petsc-users@mcs.anl.gov 
> Subject: Re: [petsc-users] IMPI with Hypre
> Its easier to just add the newer version of gcc/g++ compilers to PATH - and 
> icc will pick it up [without requiring -gcc-toolchain option]
> 
> export PATH=/location/of/newer/g++/bin:$PATH
> ./configure ...
> make ...
> 
> Satish
> 
> On Tue, 8 Aug 2023, Victor Eijkhout wrote:
> 
> > Maybe an option for specifying the explicit location of gcc version? The 
> > intel compiler has a “-gcc-toolchain” option for that.
> >
> > https://www.intel.com/content/www/us/en/docs/dpcpp-cpp-compiler/developer-guide-reference/2023-0/gcc-toolchain.html
> >
> > Victor.
> >
> >
> >
> >
> 


Re: [petsc-users] IMPI with Hypre

2023-08-08 Thread Satish Balay via petsc-users
It's easier to just add the newer version of gcc/g++ compilers to PATH - and icc 
will pick it up [without requiring -gcc-toolchain option]

export PATH=/location/of/newer/g++/bin:$PATH
./configure ...
make ...

Satish

On Tue, 8 Aug 2023, Victor Eijkhout wrote:

> Maybe an option for specifying the explicit location of gcc version? The 
> intel compiler has a “-gcc-toolchain” option for that.
> 
> https://www.intel.com/content/www/us/en/docs/dpcpp-cpp-compiler/developer-guide-reference/2023-0/gcc-toolchain.html
> 
> Victor.
> 
> 
> 
> 


Re: [petsc-users] compiler related error (configuring Petsc)

2023-08-01 Thread Satish Balay via petsc-users
> gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)

Is it possible for you to use a newer version of the GNU compilers?

If not - your alternative is to build PETSc with --with-cxx=0 option

But then - you can't use --download-superlu_dist or any pkgs that need
c++ [you could try building them separately though]
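
For example (sketch - hypre is plain C, so it should still work with
--with-cxx=0):

```shell
./configure --with-cc=gcc --with-fc=gfortran --with-cxx=0 \
  --download-fblaslapack --download-hypre=1
make
```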

Satish


On Tue, 1 Aug 2023, maitri ksh wrote:

> I am trying to compile petsc on a cluster (x86_64-redhat-linux; '
> *configure.log'* is attached herewith). Initially I got an error related
> to the 'C++11' flag; to troubleshoot this issue, I used 'CPPFLAGS' and
> 'CXXFLAGS' and could get past the non-compliance error related to the C++
> compiler, but now it gives me another error: 'cannot find a C preprocessor'.
> How do I fix this?
> 



Re: [petsc-users] support for mixed block size matrices/AIM in PETSc?

2023-07-24 Thread Satish Balay via petsc-users
One way to boost performance [of MatVec etc] in sparse matrices with
blocks is to avoid loading (from memory to cpu registers) the
row/col indices for the blocks - when possible.  [the performance
boost here comes from the fact that the memory bandwidth requirements get
reduced]

So we have BAIJ matrix - where only one row,col value is used for a
block of values.  And as you've noted - this format requires a fixed
block size for the entire matrix.

Alternative support in PETSc is within AIJ matrix (enabled by default)
- where it scans consecutive rows - to see if they share the same
column values (i.e inodes) - and reuses them for these rows.

A matrix with varying blocks might benefit from this. To check - for ex:

[balay@pj01 tutorials]$ ./ex10 -f0 ~/datafiles/matrices/cfd.2.10 -f1 
~/datafiles/matrices/cfd.2.10 -info |grep -i inode
[0]  MatSeqAIJCheckInode(): Found 24576 nodes of 122880. Limit used: 5. 
Using Inode routines
[0]  MatSeqAIJCheckInode(): Found 24576 nodes of 122880. Limit used: 5. 
Using Inode routines

Satish

On Mon, 24 Jul 2023, Daniel Stone wrote:

> Hello PETSc Users/Developers,
> 
> A collegue of mine is looking into implementing an adaptive implicit method
> (AIM) over
> PETSc in our simulator. This has led to some interesting questions about
> what can
> be done with blocked matrices, which I'm not able to answer myself - does
> anyone have
> any insight?
> 
> Apparently it would be ideal if we could find a matrix (and vector) type
> that supports a kind
> of mixed block size:
> 
> "For AIM [...] we will have matrix elements of various shapes: 1x1, 1xN,
> Nx1 and NxN. [...]. The solution and residual will be a mix of 1 and N
> variable/cell block"
> 
> There are ideas for how to implement what we want using the
> fixed-block-size objects we
> understand well, but if anything like the above exists it would be very
> exciting.
> 
> 
> Thanks,
> 
> Daniel
> 



Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-21 Thread Satish Balay via petsc-users
; and include files - if built using configure and make, all the include 
> >>> files are conveniently copied
> >>> into hypre/src/hypre/include. This is not done for a cmake build - I had 
> >>> to do the copying myself. Maybe I missed one.
> >>> 
> >>> 
> >>> On shared vs. static - is there a clear way of telling which I've ended 
> >>> up with? I've checked the cmakelists for hypre and this seems to imply 
> >>> that not-shared is the default,
> >>> which I didn't change:
> >>> 
> >>> # Configuration options
> >>> option(HYPRE_ENABLE_SHARED   "Build a shared library" OFF)
> >>> option(HYPRE_ENABLE_BIGINT   "Use long long int for HYPRE_Int" 
> >>> OFF)
> >>> option(HYPRE_ENABLE_MIXEDINT "Use long long int for HYPRE_BigInt, 
> >>> int for HYPRE_INT" OFF)
> >>> []
> >>> 
> >>> 
> >>> checking again, I've noticed that the way that the stub-test fails is 
> >>> different depending on whether it's called from the config script or used 
> >>> in isolation - more details on that soon.
> >>> 
> >>> 
> >>> 
> >>> Thanks again,
> >>> 
> >>> Daniel
> >>> 
> >>> 
> >>> 
> >>> On Wed, Jul 19, 2023 at 4:58 PM Satish Balay via petsc-users 
> >>> mailto:petsc-users@mcs.anl.gov>> wrote:
> >>>> I think it should work with static libraries and 64bit compilers.
> >>>> 
> >>>> That's how I think --download-f2cblaslapack [etc] work.
> >>>> 
> >>>> Also it works with MS-MPI - even though it's a dll install, the library 
> >>>> stub provides this symbol somehow..
> >>>> 
> >>>> balay@ps5 /cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Lib/x64
> >>>> $ nm -Ao msmpi.lib |grep " MPI_Init"
> >>>> msmpi.lib:msmpi.dll: T MPI_Init
> >>>> msmpi.lib:msmpi.dll: T MPI_Init_thread
> >>>> msmpi.lib:msmpi.dll: T MPI_Initialized
> >>>> 
> >>>> However - if the library symbol is somehow mangled - this configure mode 
> >>>> of checking library functions will fail.
> >>>> 
> >>>> Checking PETSc dll build:
> >>>> 
> >>>> balay@ps5 ~/petsc/arch-ci-mswin-uni/lib
> >>>> $ nm -Ao libpetsc.lib |grep MatCreateSeqAIJWithArrays
> >>>> libpetsc.lib:libpetsc.dll: I 
> >>>> __imp_MatCreateSeqAIJWithArrays
> >>>> libpetsc.lib:libpetsc.dll: T MatCreateSeqAIJWithArrays
> >>>> 
> >>>> It also has the unmangled symbol - so I guess this mode can work 
> >>>> generally with dlls.
> >>>> 
> >>>> Satish
> >>>> 
> >>>> 
> >>>> On Wed, 19 Jul 2023, Barry Smith wrote:
> >>>> 
> >>>>> 
> >>>>>  Satish,
> >>>>> 
> >>>>>   So it will always fail on Windows with Windows compilers (both with 
> >>>>> static and shared libraries)? Is this true for all PETSc external 
> >>>>> packages? If so, why does the installation documentation say that some 
> >>>>> external packages can work with Windows compilers? (Presumably PETSc 
> >>>>> cannot since the configure tests will fail).
> >>>>> 
> >>>>>  Barry
> >>>>> 
> >>>>> 
> >>>>>> On Jul 19, 2023, at 11:40 AM, Satish Balay  >>>>>> <mailto:ba...@mcs.anl.gov>> wrote:
> >>>>>> 
> >>>>>> BTW: Some explanation of configure:
> >>>>>> 
> >>>>>> It attempts the following on linux:
> >>>>>> 
> >>>>>>>>>>>> 
> >>>>>> Source:
> >>>>>> #include "confdefs.h"
> >>>>>> #include "conffix.h"
> >>>>>> /* Override any gcc2 internal prototype to avoid an error. */
> >>>>>> char HYPRE_IJMatrixCreate();
> >>>>>> static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }
> >>>>>> 
> >>>>>> int main(void) {
> >>>>>> _check_HYPRE_IJ

Re: [petsc-users] MPICH C++ compilers when using PETSC --with-cxx=0

2023-07-21 Thread Satish Balay via petsc-users
On Fri, 21 Jul 2023, Satish Balay via petsc-users wrote:

> Were you able to try Jacob's fix - so you could build with cxx?
> 
> Wrt building external pkgs - one way:
> 
> - first build pkgs:
> ./configure PETSC_ARCH=arch-pkgs --prefix=$HOME/soft/petsc-pkgs --with-cc=icc 
> --with-cxx=icpc --with-fc=ifort --download-mpich --download-suitesparse
> 
> - now build PETSc with these pkgs
> ./configure PETSC_ARCH=arch-mybuild --with-mpi-dir=$HOME/soft/petsc-pkgs 
> --with-cxx=0
> 
> [can't use --with-suitesparse-dir here - due to cxx dependencies within 
> configure - but you disable this dependency check in configure - by 
> commenting out one line of code]
> 
> Wrt configure options used - they are listed in configure.log - also in - for 
> ex:
> 
> [balay@pj01 petsc]$ ls arch-linux-c-debug/externalpackages/*/*petsc*
> arch-linux-c-debug/externalpackages/git.sowing/sowing.petscconf

Also at:

[balay@pj01 petsc]$ ls arch-linux-c-debug/lib/petsc/conf/pkg.conf*
arch-linux-c-debug/lib/petsc/conf/pkg.conf.sowing

Satish

> 
> Satish
> 
> On Fri, 21 Jul 2023, Barry Smith wrote:
> 
> > 
> >   You need to look in the configure.log to see the exact 
> > configure/cmake command PETSc configure is using for each package it builds 
> > specific to that run of PETSc configure. We do not save them in some other 
> > place.
> > 
> > 
> > 
> > > On Jul 21, 2023, at 12:14 PM, robert.crock...@lamresearch.com wrote:
> > > 
> > > Can I easily get the MPICH config PETSc uses? I’m poking through the repo 
> > > and not seeing anything related to config of downloaded packages.
> > > Thanks,
> > > Robert
> > >  
> > > From: Barry Smith <bsm...@petsc.dev>
> > > Sent: Friday, July 21, 2023 11:35 AM
> > > To: Crockett, Robert <robert.crock...@lamresearch.com>
> > > Cc: petsc-users@mcs.anl.gov
> > > Subject: Re: [petsc-users] MPICH C++ compilers when using PETSC 
> > > --with-cxx=0
> > >  
> > > 
> > >  
> > > 
> > >  
> > >   No, you will need to build MPICH yourself, stand-alone and then direct 
> > > PETSc's configure to use what you have built.
> > >  
> > >   Barry
> > >  
> > > 
> > > 
> > > On Jul 21, 2023, at 11:11 AM, Robert Crockett via petsc-users 
> > > <petsc-users@mcs.anl.gov> wrote:
> > >  
> > > Hello,
> > > I built PETSc with --with-cxx=0 in order to get around a likely Intel C++ 
> > > compiler bug.
> > > However, the MPICH that also gets built by PETSc then picks up the wrong 
> > > C++ compiler; mpicxx -show indicates that it is using G++, while mpicc is 
> > > correctly using icc.
> > >  
> > > Is there a way to get PETSc to pass the correct C++ compiler for the 
> > > MPICH build when using --with-cxx=0? I need to compile parts of my own 
> > > program with mpicxx/icpc.
> > > Robert Crockett 
> > > Plasma Simulation Engineer | OCTO - Computational Products
> > > P: 617.648.8349  M: 415.205.4567
> > > 
> > > LAM RESEARCH
> > > 4650 Cushing Pkwy, Fremont CA 94538 USA 
> > > lamresearch.com <https://www.lamresearch.com/>
> > > 
> > >  
> > > 
> > > LAM RESEARCH CONFIDENTIALITY NOTICE: This e-mail transmission, and any 
> > > documents, files, or previous e-mail messages attached to it, 
> > > (collectively, "E-mail Transmission") may be subject to one or more of 
> > > the following based on the associated sensitivity level: E-mail 
> > > Transmission (i) contains confidential information, (ii) is prohibited 
> > > from distribution outside of Lam, and/or (iii) is intended solely for and 
> > > restricted to the specified recipient(s). If you are not the intended 
> > > recipient, or a person responsible for delivering it to the intended 
> > > recipient, you are hereby notified that any disclosure, copying, 
> > > distribution or use of any of the information contained in or attached to 
> > > this message is STRICTLY PROHIBITED. If you have received this 
> > > transmission in error, please immediately notify the sender and destroy 
> > > the original transmission and its attachments without reading them or 
> > > saving them to disk. Thank you.
> > > 
> > > 
> > > Confidential – Limited Access and Use
> > >  
> > > 
> > 
> > 
> 


Re: [petsc-users] MPICH C++ compilers when using PETSC --with-cxx=0

2023-07-21 Thread Satish Balay via petsc-users
Were you able to try Jacob's fix - so you could build with cxx?

Wrt building external pkgs - one way:

- first build pkgs:
./configure PETSC_ARCH=arch-pkgs --prefix=$HOME/soft/petsc-pkgs --with-cc=icc 
--with-cxx=icpc --with-fc=ifort --download-mpich --download-suitesparse

- now build PETSc with these pkgs
./configure PETSC_ARCH=arch-mybuild --with-mpi-dir=$HOME/soft/petsc-pkgs 
--with-cxx=0

[can't use --with-suitesparse-dir here - due to cxx dependencies within 
configure - but you disable this dependency check in configure - by commenting 
out one line of code]

Wrt configure options used - they are listed in configure.log - also in - for 
ex:

[balay@pj01 petsc]$ ls arch-linux-c-debug/externalpackages/*/*petsc*
arch-linux-c-debug/externalpackages/git.sowing/sowing.petscconf

Satish

On Fri, 21 Jul 2023, Barry Smith wrote:

> 
> >   You need to look in the configure.log to see the exact configure/cmake 
> command PETSc configure is using for each package it builds specific to that 
> run of PETSc configure. We do not save them in some other place.
> 
> 
> 
> > On Jul 21, 2023, at 12:14 PM, robert.crock...@lamresearch.com wrote:
> > 
> > Can I easily get the MPICH config PETSc uses? I’m poking through the repo 
> > and not seeing anything related to config of downloaded packages.
> > Thanks,
> > Robert
> >  
> > From: Barry Smith <bsm...@petsc.dev>
> > Sent: Friday, July 21, 2023 11:35 AM
> > To: Crockett, Robert <robert.crock...@lamresearch.com>
> > Cc: petsc-users@mcs.anl.gov
> > Subject: Re: [petsc-users] MPICH C++ compilers when using PETSC --with-cxx=0
> >  
> > 
> >  
> > 
> >  
> >   No, you will need to build MPICH yourself, stand-alone and then direct 
> > PETSc's configure to use what you have built.
> >  
> >   Barry
> >  
> > 
> > 
> > On Jul 21, 2023, at 11:11 AM, Robert Crockett via petsc-users 
> > <petsc-users@mcs.anl.gov> wrote:
> >  
> > Hello,
> > I built PETSc with --with-cxx=0 in order to get around a likely Intel C++ 
> > compiler bug.
> > However, the MPICH that also gets built by PETSc then picks up the wrong 
> > C++ compiler; mpicxx -show indicates that it is using G++, while mpicc is 
> > correctly using icc.
> >  
> > Is there a way to get PETSc to pass the correct C++ compiler for the MPICH 
> > build when using --with-cxx=0? I need to compile parts of my own program 
> > with mpicxx/icpc.
> > Robert Crockett 
> > Plasma Simulation Engineer | OCTO - Computational Products
> > P: 617.648.8349  M: 415.205.4567
> > 
> > LAM RESEARCH
> > 4650 Cushing Pkwy, Fremont CA 94538 USA 
> > lamresearch.com 
> > 
> >  
> > 
> > 
> 
> 


Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-20 Thread Satish Balay via petsc-users
> The "nostdlib", etc, options only work for linux versions of
> the compiler,
> so I'm not sure what to do about the first warning. From the errors, it
> looks like some core c or c++ library that hypre depends on isn't visible.
> I had some similar
> issues with ptscotch - but in that case I didn't have the warnings, and the
> errors gave me the names of libraries that were missing, which I linked in
> using the --cflags
> option (maybe --cc-linker-flags would have been neater, but it worked). I've
> tried both in order to try to get the above working).
> 
> 
> I can go into detail about the compile and linker commands if needed; I'd
> have to explain more about my choices for --cflags, etc too. I wonder if
> any of the above output
> shines any light on the hypre-is-shared-library hypothesis.
> 
> 
> Thanks,
> 
> Daniel
> 
> On Wed, Jul 19, 2023 at 4:58 PM Satish Balay via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
> 
> > I think it should work with static libraries and 64bit compilers.
> >
> > That's how I think --download-f2cblaslapack [etc] work.
> >
> > Also it works with MS-MPI - even-though its a dll install, the library
> > stub provides this symbol somehow..
> >
> > balay@ps5 /cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Lib/x64
> > $ nm -Ao msmpi.lib |grep " MPI_Init"
> > msmpi.lib:msmpi.dll: T MPI_Init
> > msmpi.lib:msmpi.dll: T MPI_Init_thread
> > msmpi.lib:msmpi.dll: T MPI_Initialized
> >
> > However - if the library symbol is somehow mangled - this configure mode
> > of checking library functions will fail.
> >
> > Checking PETSc dll build:
> >
> > balay@ps5 ~/petsc/arch-ci-mswin-uni/lib
> > $ nm -Ao libpetsc.lib |grep MatCreateSeqAIJWithArrays
> > libpetsc.lib:libpetsc.dll: I
> > __imp_MatCreateSeqAIJWithArrays
> > libpetsc.lib:libpetsc.dll: T MatCreateSeqAIJWithArrays
> >
> > It also has the unmangled symbol - so I guess this mode can work generally
> > with dlls.
> >
> > Satish
> >
> >
> > On Wed, 19 Jul 2023, Barry Smith wrote:
> >
> > >
> > >   Satish,
> > >
> > >So it will always fail on Windows with Windows compilers (both with
> > static and shared libraries)? Is this true for all PETSc external packages?
> > If so, why does the installation documentation say that some external
> > packages can work with Windows compilers? (Presumably PETSc cannot since
> > the configure tests will fail).
> > >
> > >   Barry
> > >
> > >
> > > > On Jul 19, 2023, at 11:40 AM, Satish Balay  wrote:
> > > >
> > > > BTW: Some explanation of configure:
> > > >
> > > > It attempts the following on linux:
> > > >
> > > >>>>>>>
> > > > Source:
> > > > #include "confdefs.h"
> > > > #include "conffix.h"
> > > > /* Override any gcc2 internal prototype to avoid an error. */
> > > > char HYPRE_IJMatrixCreate();
> > > > static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }
> > > >
> > > > int main(void) {
> > > > _check_HYPRE_IJMatrixCreate();
> > > >  return 0;
> > > > }
> > > > <<<<<<<
> > > >
> > > > Note - it does not include 'HYPRE.h' here - but redefines the
> > prototype as 'char HYPRE_IJMatrixCreate();'
> > > >
> > > > Compiling it manually:
> > > >
> > > >>>>>
> > > > [balay@pj01 petsc]$ cat conftest.c
> > > > char HYPRE_IJMatrixCreate();
> > > > static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }
> > > >
> > > > int main(void) {
> > > > _check_HYPRE_IJMatrixCreate();
> > > >  return 0;
> > > > }
> > > > [balay@pj01 petsc]$ gcc -c conftest.c
> > > > [balay@pj01 petsc]$ nm -Ao conftest.o |grep HYPRE_IJMatrixCreate
> > > > conftest.o: t _check_HYPRE_IJMatrixCreate
> > > > conftest.o: U HYPRE_IJMatrixCreate
> > > > [balay@pj01 petsc]$ nm -Ao arch-linux-c-debug/lib/libHYPRE.so |grep
> > HYPRE_IJMatrixCreate
> > > > arch-linux-c-debug/lib/libHYPRE.so:0007f2c2 T
> > HYPRE_IJMatrixCreate
> > > > [balay@pj01 petsc]$
> > > > <<<<
> > > >
> > > > Here the "U HYPRE_IJMat

Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-20 Thread Satish Balay via petsc-users
Can check config/BuildSystem/config/packages/hypre.py

petsc-3.19 (or release branch) is compatible with hypre 2.28.0, petsc 'main' 
branch with 2.29.0

Satish
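The breaking change behind this version requirement - hypre_Error switching from a plain int to a struct - can be mimicked with ctypes to see why old code like `hypre__global_error = 0` stops compiling and why the ABI differs. The field names below are invented for illustration, not hypre's real layout:

```python
import ctypes

# Hypothetical mimic of the old hypre_Error type: just an int.
OldError = ctypes.c_int

# Hypothetical mimic of the new struct form (field names invented,
# not hypre's actual definition).
class NewError(ctypes.Structure):
    _fields_ = [("err_flag", ctypes.c_int), ("msg_id", ctypes.c_int)]

# An int value can initialize OldError but not NewError - the C
# analogue is the "assigning to 'hypre_Error' from incompatible
# type 'int'" error seen in the build log below.
old = OldError(0)        # fine: plain int
new = NewError(0, 0)     # struct needs per-field initialization

# The in-memory layout also differs, so even a cast would be unsafe:
print(ctypes.sizeof(OldError), ctypes.sizeof(NewError))
```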

On Thu, 20 Jul 2023, Barry Smith via petsc-users wrote:

> 
>   You cannot use this version of PETSc, 3.19, with the version of hypre you 
> installed. In hypre they recently changed hypre_Error from an integer to a 
> struct which completely breaks compatibility with previous versions of hypre 
> (and hence previous versions of PETSc). You must use the main git branch of 
> PETSc with the version of hypre you installed.
> 
>   Barry
> 
> 
> > On Jul 20, 2023, at 5:10 AM, Daniel Stone  
> > wrote:
> > 
> > Hi All,
> > 
> > Many thanks for the detailed explainations and ideas!
> > 
> > I tried skipping the test. When it came time to do the build itself (make 
> > $PETSC_DIR... all) I get some failures, unsurprisingly:
> > 
> > 
> > 
> >  FC arch-mswin-c-opt/obj/dm/f90-mod/petscdmplexmod.o
> >  CC arch-mswin-c-opt/obj/ksp/pc/impls/hypre/ftn-custom/zhypref.o
> >  CC arch-mswin-c-opt/obj/ksp/pc/impls/hypre/ftn-auto/hypref.o
> >  CC arch-mswin-c-opt/obj/ksp/pc/impls/hypre/hypre.o
> > C:\cygwin64\home\DANIEL~1\PETSC_~1.1\src\ksp\pc\impls\hypre\hypre.c(444,29):
> >  error: assigning to 'hypre_Error' from incompatible type 'int'
> > hypre__global_error = 0;
> > ^ ~
> > C:\cygwin64\home\DANIEL~1\PETSC_~1.1\include\petscerror.h(1752,7): note: 
> > expanded from macro 'PetscStackCallExternalVoid'
> >   __VA_ARGS__; \
> >   ^~~
> > C:\cygwin64\home\DANIEL~1\PETSC_~1.1\src\ksp\pc\impls\hypre\hypre.c(634,29):
> >  error: assigning to 'hypre_Error' from incompatible type 'int'
> > hypre__global_error = 0;
> > ^ ~
> > C:\cygwin64\home\DANIEL~1\PETSC_~1.1\include\petscerror.h(1752,7): note: 
> > expanded from macro 'PetscStackCallExternalVoid'
> >   __VA_ARGS__; \
> >   ^~~
> > 2 errors generated.
> > make[3]: *** [gmakefile:195: 
> > arch-mswin-c-opt/obj/ksp/pc/impls/hypre/hypre.o] Error 1
> > make[3]: *** Waiting for unfinished jobs
> >  FC arch-mswin-c-opt/obj/ksp/f90-mod/petsckspdefmod.o
> >  CC arch-mswin-c-opt/obj/dm/impls/da/hypre/mhyp.o
> >  CC arch-mswin-c-opt/obj/mat/impls/hypre/mhypre.o
> > make[3]: Leaving directory '/home/DanielOGS/petsc_ogs_3.19.1'
> > make[2]: *** [/home/DanielOGS/petsc_ogs_3.19.1/lib/petsc/conf/rules.doc:28: 
> > libs] Error 2
> > make[2]: Leaving directory '/home/DanielOGS/petsc_ogs_3.19.1'
> > **ERROR*
> >   Error during compile, check arch-mswin-c-opt/lib/petsc/conf/make.log
> >   Send it and arch-mswin-c-opt/lib/petsc/conf/configure.log to 
> > petsc-ma...@mcs.anl.gov
> > 
> > Finishing make run at Wed, 19 Jul 2023 17:07:00 +0100
> > 
> > 
> > 
> > But wait - isn't this the compile stage, not the linking stage? This seems 
> > to imply that I've made a hash of providing include files such that a 
> > definition of "hypre_Error" 
> > cannot be seen - unless I'm misinterpreting. Interesting note about Hypre 
> > and include files - if built using configure and make, all the include 
> > files are conveniently copied
> > into hypre/src/hypre/include. This is not done for a cmake build - I had to 
> > do the copying myself. Maybe I missed one.
> > 
> > 
> > On shared vs. static - is there a clear way of telling which I've ended up 
> > with? I've checked the cmakelists for hypre and this seems to imply that 
> > not-shared is the default,
> > which I didn't change:
> > 
> > # Configuration options
> > option(HYPRE_ENABLE_SHARED   "Build a shared library" OFF)
> > option(HYPRE_ENABLE_BIGINT       "Use long long int for HYPRE_Int" OFF)
> > option(HYPRE_ENABLE_MIXEDINT "Use long long int for HYPRE_BigInt, 
> > int for HYPRE_INT" OFF)
> > []
> > 
> > 
> > checking again, I've noticed that the way that the stub-test fails is 
> > different depending on whether it's called from the config script or used 
> > in isolation - more details on that soon.
> > 
> > 
> > 
> > Thanks again,
> > 
> > Daniel
> > 
> > 
> > 
> >

Re: [petsc-users] Failing PETSc 3.19.2 compile using ICPC

2023-07-19 Thread Satish Balay via petsc-users
you can try --with-cxx-dialect=11 and see if that works.

With --with-cxx=0, --download-suitesparse [and other pkgs that require cxx] 
won't work - and would need to be installed separately

Jacob,

One more use case for --with-cxx-bindings=0

Satish

On Wed, 19 Jul 2023, Barry Smith wrote:

> 
>   Do you need C++, you can configure --with-cxx=0 if you do not need it.
> 
>   You can also try the main branch of PETSc or slightly different versions of 
> the compiler.
> 
>   Barry
> 
> 
> 
> > On Jul 19, 2023, at 4:31 PM, Robert Crockett via petsc-users 
> >  wrote:
> > 
> > Hello,
> > I am attempting to build PETSc using the 2018.5.274 Intel compiler suite on 
> > CentOS7. I get the below error messages.
> > See the attached for more information. 
> > The file petscbuild.sh is the script used to configure and build
> > The file configure.log is output by PETSc
> > The file log.petscbuild is the output of the config & build script.
> >  
> > Can you please help find a work-around?
> > Best,
> > Robert
> >  
> > PS. Some related links I found in searching on this issue.
> > The first references a related compiler bug ticket opened with Intel, 
> > though I cannot tell if it was closed or if it applied to my compiler.
> > https://community.intel.com/t5/Intel-C-Compiler/Default-constructor-of-variant-is-deleted/m-p/1156212?profile.language=en
> > https://mediatum.ub.tum.de/doc/1555265/1555265.pdf
> >  
> > Robert Crockett 
> > Plasma Simulation Engineer | OCTO - Computational Products
> > P: 617.648.8349  M: 415.205.4567
> > 
> > LAM RESEARCH
> > 4650 Cushing Pkwy, Fremont CA 94538 USA 
> > lamresearch.com 
> > 
> > 
> > -
> > Using C compile: /usr/local/petsc/r/bin/mpicc -o .o -c -wd1572 
> > -Wno-unknown-pragmas -O3 -mtune=generic
> > mpicc -show: icc -fPIC -wd1572 -O3 -mtune=generic 
> > -I/usr/local/petsc/r/include -L/usr/local/petsc/r/lib -Wl,-rpath 
> > -Wl,/usr/local/petsc/r/lib -Wl,--enable-new-dtags -lmpi
> > C compiler version: icc (ICC) 18.0.5 20180823
> > Using C++ compile: /usr/local/petsc/r/bin/mpicxx -o .o -c -wd1572 -O3 
> > -mtune=generic  -std=c++14  -I/usr/local/share/petsc/src/include 
> > -I/usr/local/share/petsc/src/arch-linux-c-opt/include 
> > -I/usr/local/petsc/r/include
> > mpicxx -show: icpc -wd1572 -O3 -mtune=generic -std=c++14 -fPIC 
> > -I/usr/local/petsc/r/include -L/usr/local/petsc/r/lib -lmpicxx -Wl,-rpath 
> > -Wl,/usr/local/petsc/r/lib -Wl,--enable-new-dtags -lmpi
> > C++ compiler version: icpc (ICC) 18.0.5 20180823
> > Using Fortran compile: /usr/local/petsc/r/bin/mpif90 -o .o -c -Wall 
> > -ffree-line-length-none -ffree-line-length-0 -Wno-lto-type-mismatch 
> > -Wno-unused-dummy-argument -O3 -mtune=generic   
> > -I/usr/local/share/petsc/src/include 
> > -I/usr/local/share/petsc/src/arch-linux-c-opt/include 
> > -I/usr/local/petsc/r/include
> > mpif90 -show: gfortran -fPIC -ffree-line-length-none -ffree-line-length-0 
> > -Wno-lto-type-mismatch -O3 -mtune=generic -I/usr/local/petsc/r/include 
> > -I/usr/local/petsc/r/include -L/usr/local/petsc/r/lib -lmpifort -Wl,-rpath 
> > -Wl,/usr/local/petsc/r/lib -Wl,--enable-new-dtags -lmpi
> > Fortran compiler version: GNU Fortran (GCC) 7.3.1 20180303 (Red Hat 7.3.1-5)
> > -
> > Using C/C++ linker: /usr/local/petsc/r/bin/mpicc
> > Using C/C++ flags: -wd1572 -Wno-unknown-pragmas -O3 -mtune=generic
> > Using Fortran linker: /usr/local/petsc/r/bin/mpif90
> > Using Fortran flags: -Wall -ffree-line-length-none -ffree-line-length-0 
> > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -O3 -mtune=generic
> > -
> > Using system modules:
> > Using mpi.h: # 1 "/usr/local/petsc/r/include/mpi.h" 1
> > -
> > Using libraries: -Wl,-rpath,/usr/local/share/petsc/src/arch-linux-c-opt/lib 
> > -L/usr/local/share/petsc/src/arch-linux-c-opt/lib 
> > -Wl,-rpath,/usr/local/petsc/r/lib -L/usr/local/petsc/r/lib 
> > -Wl,-rpath,/opt/rh/devtoolset-7/root/usr/lib/gcc/x86_64-redhat-linux/7 
> > -L/opt/rh/devtoolset-7/root/usr/lib/gcc/x86_64-redhat-linux/7 
> > -Wl,-rpath,/opt/rh/devtoolset-7/root/usr/lib64 
> > -L/opt/rh/devtoolset-7/root/usr/lib64 
> > -Wl,-rpath,/work/tools/intel/composer_xe_2018_4/compilers_and_libraries_2018.5.274/linux/ipp/lib/intel64
> >  
> > -L/work/tools/intel/composer_xe_2018_4/compilers_and_libraries_2018.5.274/linux/ipp/lib/intel64
> >  
> > -Wl,-rpath,/work/tools/intel/composer_xe_2018_4/compilers_and_libraries_2018.5.274/linux/compiler/lib/intel64_lin
> >  
> > -L/work/tools/intel/composer_xe_2018_4/compilers_and_libraries_2018.5.274/linux/compiler/lib/intel64_lin
> >  
> > -Wl,-rpath,/work/tools/intel/composer_xe_2018_4/compilers_and_libraries_2018.5.274/linux/mkl/lib/intel64_lin
> >  -L/work/tools/intel/composer_xe_2018_4/compilers_and_libraries_2018.5.274/linux/mkl/lib/intel64_lin 

Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-19 Thread Satish Balay via petsc-users
I think it should work with static libraries and 64bit compilers.

That's how I think --download-f2cblaslapack [etc] work.

Also it works with MS-MPI - even-though its a dll install, the library stub 
provides this symbol somehow..

balay@ps5 /cygdrive/c/Program Files (x86)/Microsoft SDKs/MPI/Lib/x64
$ nm -Ao msmpi.lib |grep " MPI_Init"
msmpi.lib:msmpi.dll: T MPI_Init
msmpi.lib:msmpi.dll: T MPI_Init_thread
msmpi.lib:msmpi.dll: T MPI_Initialized

However - if the library symbol is somehow mangled - this configure mode of 
checking library functions will fail.

Checking PETSc dll build:

balay@ps5 ~/petsc/arch-ci-mswin-uni/lib
$ nm -Ao libpetsc.lib |grep MatCreateSeqAIJWithArrays
libpetsc.lib:libpetsc.dll: I __imp_MatCreateSeqAIJWithArrays
libpetsc.lib:libpetsc.dll: T MatCreateSeqAIJWithArrays

It also has the unmangled symbol - so I guess this mode can work generally with 
dlls.

Satish


On Wed, 19 Jul 2023, Barry Smith wrote:

> 
>   Satish,
> 
>So it will always fail on Windows with Windows compilers (both with static 
> and shared libraries)? Is this true for all PETSc external packages? If so, 
> why does the installation documentation say that some external packages can 
> work with Windows compilers? (Presumably PETSc cannot since the configure 
> tests will fail).
> 
>   Barry
> 
> 
> > On Jul 19, 2023, at 11:40 AM, Satish Balay  wrote:
> > 
> > BTW: Some explanation of configure:
> > 
> > It attempts the following on linux:
> > 
> >>>>>>> 
> > Source:
> > #include "confdefs.h"
> > #include "conffix.h"
> > /* Override any gcc2 internal prototype to avoid an error. */
> > char HYPRE_IJMatrixCreate();
> > static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }
> > 
> > int main(void) {
> > _check_HYPRE_IJMatrixCreate();
> >  return 0;
> > }
> > <<<<<<<
> > 
> > Note - it does not include 'HYPRE.h' here - but redefines the prototype as 
> > 'char HYPRE_IJMatrixCreate();'
> > 
> > Compiling it manually:
> > 
> >>>>> 
> > [balay@pj01 petsc]$ cat conftest.c
> > char HYPRE_IJMatrixCreate();
> > static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }
> > 
> > int main(void) {
> > _check_HYPRE_IJMatrixCreate();
> >  return 0;
> > }
> > [balay@pj01 petsc]$ gcc -c conftest.c
> > [balay@pj01 petsc]$ nm -Ao conftest.o |grep HYPRE_IJMatrixCreate
> > conftest.o: t _check_HYPRE_IJMatrixCreate
> > conftest.o: U HYPRE_IJMatrixCreate
> > [balay@pj01 petsc]$ nm -Ao arch-linux-c-debug/lib/libHYPRE.so |grep 
> > HYPRE_IJMatrixCreate
> > arch-linux-c-debug/lib/libHYPRE.so:0007f2c2 T HYPRE_IJMatrixCreate
> > [balay@pj01 petsc]$ 
> > <<<<
> > 
> > Here the "U HYPRE_IJMatrixCreate" in conftest.o matches "T 
> > HYPRE_IJMatrixCreate" in libHYPRE.so - so the "link" test in configure 
> > succeeds!
> > 
> >>>>>>> 
> > [balay@pj01 petsc]$ gcc -o conftest conftest.o 
> > arch-linux-c-debug/lib/libHYPRE.so
> > [balay@pj01 petsc]$ echo $?
> > 0
> > <<<<<
> > 
> > On windows - [due to name mangling by cdecl/stdcall, (/MT vs /MD) etc..] - 
> > this might not match - resulting in link failures.
> > 
> > Satish
> > 
> > 
> > On Wed, 19 Jul 2023, Satish Balay via petsc-users wrote:
> > 
> >> You could try skipping this test [and assume --with-hypre-include and 
> >> --with-hypre-lib options are correct] - and see if this works.
> >> 
> >> diff --git a/config/BuildSystem/config/packages/hypre.py 
> >> b/config/BuildSystem/config/packages/hypre.py
> >> index 5bc88322aa2..2d6c7932e17 100644
> >> --- a/config/BuildSystem/config/packages/hypre.py
> >> +++ b/config/BuildSystem/config/packages/hypre.py
> >> @@ -11,7 +11,7 @@ class Configure(config.package.GNUPackage):
> >> self.requiresversion = 1
> >> self.gitcommit   = 'v'+self.version
> >> self.download= 
> >> ['git://https://github.com/hypre-space/hypre','https://github.com/hypre-space/hypre/archive/'+self.gitcommit+'.tar.gz']
> >> -self.functions   = ['HYPRE_IJMatrixCreate']
> >> +self.functions   = []
> >> self.includes= ['HYPRE.h']
> >> self.liblist = [['libHYPRE.a']]
> >> self.buildLanguages  = ['C','Cxx']
> >> 
> >> 
> >>

Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-19 Thread Satish Balay via petsc-users
BTW: Some explanation of configure:

It attempts the following on linux:

>>>>>>
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char HYPRE_IJMatrixCreate();
static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }

int main(void) {
_check_HYPRE_IJMatrixCreate();
  return 0;
}
<<<<<<<

Note - it does not include 'HYPRE.h' here - but redefines the prototype as 
'char HYPRE_IJMatrixCreate();'

Compiling it manually:

>>>>
[balay@pj01 petsc]$ cat conftest.c
char HYPRE_IJMatrixCreate();
static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }

int main(void) {
_check_HYPRE_IJMatrixCreate();
  return 0;
}
[balay@pj01 petsc]$ gcc -c conftest.c
[balay@pj01 petsc]$ nm -Ao conftest.o |grep HYPRE_IJMatrixCreate
conftest.o: t _check_HYPRE_IJMatrixCreate
conftest.o: U HYPRE_IJMatrixCreate
[balay@pj01 petsc]$ nm -Ao arch-linux-c-debug/lib/libHYPRE.so |grep 
HYPRE_IJMatrixCreate
arch-linux-c-debug/lib/libHYPRE.so:0007f2c2 T HYPRE_IJMatrixCreate
[balay@pj01 petsc]$ 
<<<<

Here the "U HYPRE_IJMatrixCreate" in conftest.o matches "T 
HYPRE_IJMatrixCreate" in libHYPRE.so - so the "link" test in configure succeeds!

>>>>>>
[balay@pj01 petsc]$ gcc -o conftest conftest.o 
arch-linux-c-debug/lib/libHYPRE.so
[balay@pj01 petsc]$ echo $?
0
<<<<<

On windows - [due to name mangling by cdecl/stdcall, (/MT vs /MD) etc..] - this 
might not match - resulting in link failures.

Satish
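The same symbol-presence check can be emulated at runtime with Python's ctypes, which is a quick way to confirm what nm shows without compiling a conftest. A sketch assuming a Linux/glibc system, with libm and cos standing in for libHYPRE and HYPRE_IJMatrixCreate:

```python
import ctypes

def has_symbol(libname, symbol):
    """Emulate configure's link test: can `symbol` be resolved in `libname`?"""
    try:
        lib = ctypes.CDLL(libname)
        getattr(lib, symbol)  # raises AttributeError if the symbol is absent
        return True
    except (OSError, AttributeError):
        return False

# libm/cos stand in for libHYPRE/HYPRE_IJMatrixCreate (glibc naming assumed)
print(has_symbol("libm.so.6", "cos"))
print(has_symbol("libm.so.6", "HYPRE_IJMatrixCreate"))
```

On Windows the analogous probe would go through the import library or dll, which is exactly where the cdecl/stdcall mangling can make the lookup fail.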


On Wed, 19 Jul 2023, Satish Balay via petsc-users wrote:

> You could try skipping this test [and assume --with-hypre-include and 
> --with-hypre-lib options are correct] - and see if this works.
> 
> diff --git a/config/BuildSystem/config/packages/hypre.py 
> b/config/BuildSystem/config/packages/hypre.py
> index 5bc88322aa2..2d6c7932e17 100644
> --- a/config/BuildSystem/config/packages/hypre.py
> +++ b/config/BuildSystem/config/packages/hypre.py
> @@ -11,7 +11,7 @@ class Configure(config.package.GNUPackage):
>  self.requiresversion = 1
>  self.gitcommit   = 'v'+self.version
>  self.download= 
> ['git://https://github.com/hypre-space/hypre','https://github.com/hypre-space/hypre/archive/'+self.gitcommit+'.tar.gz']
> -self.functions   = ['HYPRE_IJMatrixCreate']
> +self.functions   = []
>  self.includes= ['HYPRE.h']
>  self.liblist = [['libHYPRE.a']]
>  self.buildLanguages  = ['C','Cxx']
> 
> 
> Satish
> 
> 
> On Wed, 19 Jul 2023, Barry Smith wrote:
> 
> > 
> >   You don't indicate what type of libraries you built hypre with; static or 
> > shared. My guess is you ended up with shared
> > 
> >   I think the answer to your difficulty is hidden in __cdecl (Satish will 
> > know much better than me). When you are looking for symbols in Windows 
> > shared libraries you have to prepend something to the function prototype to 
> > have it successfully found. For example the PETSc include files have these 
> > things __declspec(dllimport) The configure test fails because it does not 
> > provide the needed prototype. Likely you built PTScotch with static 
> > libraries so no problem.
> > 
> >   The simplest fix would be to build static hypre libraries. I think it is 
> > a major project to get PETSc configure and macro system to work properly 
> > with external packages that are in Windows shared libraries since more use 
> > of __declspec would be needed.
> > 
> >   Barry
> > 
> >   The PETSc installation instructions should probably say something about 
> > external packages with Windows shared libraries.
> > 
> > 
> >   
> > 
> > 
> > 
> > 
> > > On Jul 19, 2023, at 10:52 AM, Daniel Stone  
> > > wrote:
> > > 
> > > Hello,
> > > 
> > > I'm working on getting a petsc build running on windows. One necessary 
> > > package to include is Hypre. I've been able to build Hypre separately 
> > > using cmake, and confirmed that the library works
> > > by setting up a VS project to run some of the example programs.
> > > 
> > > My attempted petsc build is being done through cygwin. I've been able to 
> > > (with varying degrees of difficulty), build a fairly plain petsc, and one 
> > > that downloads and builds ptscotch (after some modifications
> > > to both ptscotch and the config script). I am now attempting to include 
> > > Hypre (using the --with-hypre-include and --with-hypre-lib flags, etc). Note that 
> > > the same compilers are being used for bot

Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-19 Thread Satish Balay via petsc-users
You could try skipping this test [and assume --with-hypre-include and 
--with-hypre-lib options are correct] - and see if this works.

diff --git a/config/BuildSystem/config/packages/hypre.py 
b/config/BuildSystem/config/packages/hypre.py
index 5bc88322aa2..2d6c7932e17 100644
--- a/config/BuildSystem/config/packages/hypre.py
+++ b/config/BuildSystem/config/packages/hypre.py
@@ -11,7 +11,7 @@ class Configure(config.package.GNUPackage):
 self.requiresversion = 1
 self.gitcommit   = 'v'+self.version
 self.download= 
['git://https://github.com/hypre-space/hypre','https://github.com/hypre-space/hypre/archive/'+self.gitcommit+'.tar.gz']
-self.functions   = ['HYPRE_IJMatrixCreate']
+self.functions   = []
 self.includes= ['HYPRE.h']
 self.liblist = [['libHYPRE.a']]
 self.buildLanguages  = ['C','Cxx']


Satish
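The effect of emptying self.functions can be sketched as follows - this is a simplified stand-in for BuildSystem's real logic, not actual PETSc code: with no functions listed, configure only verifies the header and library, so a mangled-symbol link failure can no longer reject an otherwise working hypre.

```python
def package_check(functions, have_header, can_link):
    """Simplified stand-in for configure's package test (not real
    BuildSystem code): the header must be found and every listed
    function must link."""
    if not have_header:
        return False
    return all(can_link(f) for f in functions)

def never_links(name):
    # Models the Windows case where name mangling breaks every link test.
    return False

# Patched (functions=[]): the broken link test is skipped entirely.
print(package_check([], True, never_links))
# Unpatched: the same failure rejects a working hypre install.
print(package_check(["HYPRE_IJMatrixCreate"], True, never_links))
```

The trade-off of the patch is that a genuinely missing or wrong --with-hypre-lib is then only caught later, at PETSc link time.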


On Wed, 19 Jul 2023, Barry Smith wrote:

> 
>   You don't indicate what type of libraries you built hypre with; static or 
> shared. My guess is you ended up with shared
> 
>   I think the answer to your difficulty is hidden in __cdecl (Satish will 
> know much better than me). When you are looking for symbols in Windows shared 
> libraries you have to prepend something to the function prototype to have it 
> successfully found. For example the PETSc include files have these things 
> __declspec(dllimport) The configure test fails because it does not provide 
> the needed prototype. Likely you built PTScotch with static libraries so no 
> problem.
> 
>   The simplest fix would be to build static hypre libraries. I think it is a 
> major project to get PETSc configure and macro system to work properly with 
> external packages that are in Windows shared libraries since more use of 
> __declspec would be needed.
> 
>   Barry
> 
>   The PETSc installation instructions should probably say something about 
> external packages with Windows shared libraries.
> 
> 
>   
> 
> 
> 
> 
> > On Jul 19, 2023, at 10:52 AM, Daniel Stone  
> > wrote:
> > 
> > Hello,
> > 
> > I'm working on getting a petsc build running on windows. One necessary 
> > package to include is Hypre. I've been able to build Hypre separately using 
> > cmake, and confirmed that the library works
> > by setting up a VS project to run some of the example programs.
> > 
> > My attempted petsc build is being done through cygwin. I've been able to 
> > (with varying degrees of difficulty), build a fairly plain petsc, and one 
> > that downloads and builds ptscotch (after some modifications
> > to both ptscotch and the config script). I am now attempting to include 
> > Hypre (using the --with-hypre-include and --with-hypre-lib flags, etc). Note that the 
> > same compilers are being used for both Hypre and for petsc
> > through cygwin - the new intel oneapi compilers (icx and ifx, after again 
> > varying amounts of pain to work around their awkwardness with the config 
> > script).
> > 
> > I'm seeing a problem when the config script does some tests on the included 
> > hypre lib. The source code looks like:
> > 
> > #include "confdefs.h"
> > #include "conffix.h"
> > /* Override any gcc2 internal prototype to avoid an error. */
> > 
> > #include "HYPRE.h"
> > 
> > char HYPRE_IJMatrixCreate();
> > static void _check_HYPRE_IJMatrixCreate() { HYPRE_IJMatrixCreate(); }
> > 
> > int main() {
> > _check_HYPRE_IJMatrixCreate();;
> >   return 0;
> > }
> > 
> > 
> > As I understand this is a fairly standard type of stub program used by the 
> > config script to check that it is able to link to certain symbols in given 
> > libraries. Tests like this have succeeded in my builds that
> > include PTScotch.
> > 
> > I keep getting a linker error with the above test, including if I separate 
> > it out and try to build it separately:
> > 
> > unresolved external symbol "char __cdecl HYPRE_IJMatrixCreate(void)" 
> > 
> > Ok, it looks like a problem with either the library or linker commands. But 
> > here's the interesting thing - If I transplant this code into VS, with the 
> > same project setting that allows it to build the much more 
> > nontrivial Hypre example programs, I get the same error:
> > 
> > Error LNK2001 unresolved external symbol "char __cdecl 
> > HYPRE_IJMatrixCreate(void)" (?HYPRE_IJMatrixCreate@@YADXZ) hypretry1 
> > C:\Users\DanielOGS\source\repos\hypretry1\hypretry1\Source.obj 1
> > 
> > So it seems like there might be something about this type of stub program 
> > that is not working with my Hypre library. I don't fully understand this 
> > program - it's able to call the function with no arguments, but
> > it also needs to be linked against a library containing the function, 
> > apparently by wrapping it in a static void function? Not something I've 
> > seen before. 
> > 
> > Does anyone have any insight into what might be going wrong - or really 
> > just any explanation of how the stub program works so I can figure out why 
> > it isn't in this case?
> > 
> > Many thanks,
> > 

Re: [petsc-users] PETSc Installation Assistance

2023-07-17 Thread Satish Balay via petsc-users
On Mon, 17 Jul 2023, Pierre Jolivet wrote:

> https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean

> > On 17 Jul 2023, at 7:51 PM, Ferrand, Jesus A.  wrote:

> > hwloc/linux: Ignoring PCI device with non-16bit domain. 
> > Pass --enable-32bits-pci-domain to configure to support such devices 
> > (warning: it would break the library ABI, don't enable unless really 
> > needed). 



> > Ran ./configure with --download-mpich-device=ch3:nemesis --download-mpich
> > 
> > Surprisingly, I can compile my PETSc programs using a makefile, and run 
> > them. 
> > However, the programs are now broken (I get SIGSEGV originating from within 
> > DM/DMPlex APIs).

This must be a different issue - you might need to run it in a debugger or 
valgrind to check this failure.

Satish


Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread Satish Balay via petsc-users
PETSc supports both --prefix and in-place installs.

Suggest using an in-place install - in your $HOME:

For ex:

>>>
balay@p1 /home/balay
$ tar -xzf petsc-3.19.
petsc-3.19.2.tar.gz  petsc-3.19.3.tar.gz  
balay@p1 /home/balay
$ tar -xzf petsc-3.19.3.tar.gz 
balay@p1 /home/balay
$ cd petsc-3.19.3
balay@p1 /home/balay/petsc-3.19.3
$ export PETSC_DIR=$PWD
balay@p1 /home/balay/petsc-3.19.3
$ export PETSC_ARCH=arch-buildfordolfin
balay@p1 /home/balay/petsc-3.19.3
$ ./configure && make
<<<

[with the required configure options to get a successful build of PETSc]

Now - with PETSC_DIR and PETSC_ARCH (env variables) - set to the above values - 
try installing dolfin


Satish


On Mon, 17 Jul 2023, Barry Smith wrote:

> 
>   When configuring and making PETSc, PETSC_DIR can be empty or point to the 
> directory with the PETSc source
> 
>   If you used --prefix to configure and install PETSc then 
> 
>   When using PETSc to compile other source code and using a makefile that 
> utilizes PETSC_DIR, then PETSC_DIR needs to point to the --prefix location
> 
>   PETSC_ARCH should be empty since it is not used for --prefix installs
> 
>
> 
> > On Jul 17, 2023, at 2:06 PM, philliprusso via petsc-users 
> >  wrote:
> > 
> > So I have this to go by:
> > export PETSC_DIR=/absolute/path/to/petsc
> > 
> > export PETSC_ARCH=linux-gnu-c-debug
> > 
> > PETSC_DIR Is that the installation/destination path or the path where the 
> > source code is? I feel confused
> > 
> > So there is a destination folder that the dolfinx project wants for petsc. I 
> > have an idea it might want /usr/local/include not sure of everything quite 
> > yet...
> > 
> > 
> > 
> > 
> > Sent with Proton Mail secure email.
> > 
> > --- Original Message ---
> > On Monday, July 17th, 2023 at 8:08 AM, Satish Balay  
> > wrote:
> > 
> > 
> >> We do not recommend installing PETSc in /usr/include [aka --prefix=/usr]. 
> >> Now you have some petsc includes here that will potentially conflict with 
> >> any other install of PETSc you might attempt.
> >> 
> >> You might have to manually check - and delete PETSc installed files from 
> >> here.
> >> 
> >> For most uses - one can install PETSc in $HOME
> >> 
> >> If you need a common install - suggest --prefix=/usr/local/petsc-version
> >> 
> >> Satish
> >> 
> >> On Sun, 16 Jul 2023, philliprusso via petsc-users wrote:
> >> 
> >>> I cloned the petsc source for use with the dolfinx project. After 
> >>> ./configure, make, and sudo make install I found there was some type of 
> >>> difficulty with the destination directory, so I copied the files manually 
> >>> into /usr/include on Ubuntu 22.04 jammy. Some petsc header files are now 
> >>> found when compiling dolfinx C++ source code, but petscconf.h is still not 
> >>> found by dolfinx's source tree of header files. Does anyone know how to 
> >>> remedy this so dolfinx C++ files can find the petscconf.h that g++ reports 
> >>> as missing? Thank you!
> >>> 
> 



Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread Satish Balay via petsc-users
We do not recommend installing PETSc in /usr/include [aka --prefix=/usr].  Now 
you have some petsc includes here that will potentially conflict with any other 
install of PETSc you might attempt.

You might have to manually check - and delete PETSc installed files from here.

For most uses - one can install PETSc in $HOME

If you need a common install - suggest --prefix=/usr/local/petsc-version

Satish

On Sun, 16 Jul 2023, philliprusso via petsc-users wrote:

> I cloned the petsc source for use with the dolfinx project. After ./configure, 
> make, and sudo make install I found there was some type of difficulty with the 
> destination directory, so I copied the files manually into /usr/include on 
> Ubuntu 22.04 jammy. Some petsc header files are now found when compiling 
> dolfinx C++ source code, but petscconf.h is still not found by dolfinx's 
> source tree of header files. Does anyone know how to remedy this so dolfinx 
> C++ files can find the petscconf.h that g++ reports as missing? Thank you!
> 



Re: [petsc-users] windows build

2023-07-11 Thread Satish Balay via petsc-users
On Tue, 11 Jul 2023, Константин via petsc-users wrote:

> 
> Hello, I'm trying to build petsc on windows. And when I make it I have such a 
> problem

from the screenshot - it appears you are not using a cygwin terminal (bash 
shell) - as per the instructions at

https://petsc.org/release/install/windows/

What compilers are you attempting to use with PETSc? Can you retry your build - 
as per the above instructions?

If you still encounter issues - send us the corresponding configure.log.

Also - it's best if you can copy/paste text from the terminal - instead of 
screenshots.

Note: if you can use WSL2(linux) on your windows machine - that might be an 
easier install

Satish

Re: [petsc-users] petsc configure problem

2023-06-21 Thread Satish Balay via petsc-users
>  --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpi PETSC_ARCH=intel_2020

> Executing: mpicc --version
> stdout:
> gcc (GCC) 7.3.1 20180303 (Red Hat 7.3.1-5)

Did you intend to use Intel compilers?

>>>
/tmp/petsc-sMhvLh/config.setCompilers/conftest.c: In function ‘main’:
/tmp/petsc-sMhvLh/config.setCompilers/conftest.c:8:9: error: ‘FLT_ROUNDS’ 
undeclared (first use in this function)
 y = FLT_ROUNDS;
 ^~
<<<

Perhaps the intel compiler setup in env is causing gcc to misbehave?

i.e if you are attempting an install with intel compilers - use 'mpiicc, 
mpiicpc, mpiifort'

if you are attempting an install with gcc - avoid the intel compiler setup in 
env

Also, the currently supported release is petsc-3.19 - we suggest using it 
instead of 3.14.

Satish



On Thu, 22 Jun 2023, Gael wrote:

> To whom it may concern,
> 
> 
> I'm trying to install PETSc on my HPC, and in the configure step something 
> keeps going wrong with the following message:
> 
> 
> PETSc requires c99 compiler! Configure could not determine compatible 
> compiler flag. Perhaps you can specify via CFLAGS
> 
> 
> We have googled it and try to add CFLAGS spefified in this postRe: 
> [petsc-users] PETSc on GCC (mail-archive.com). But none could work and the 
> error message changed as:
> 
> 
> C compiler you provided with -with-cc=mpicc cannot be found or does not work. 
> Cannot compile C with mpicc.
> 
> 
> 
> My log file is attached. Could you please give some suggestions on this 
> problem?
> 
> 
> Thank you,
> Jiahong CHEN


Re: [petsc-users] Symbol lookup errors after change of module system on cluster

2023-06-19 Thread Satish Balay via petsc-users
Are you able to run a simple MPI test code - say 
https://raw.githubusercontent.com/pmodels/mpich/main/examples/cpi.c with this 
compiler/mpi setup?

Also - do you get this error with a petsc example [using the corresponding 
petsc makefile?]

Satish

On Mon, 19 Jun 2023, Matthias Hesselmann wrote:

> Dear users,
> 
> since the operating system of the cluster I am running my application on with 
> PETSc (version 3.19) has been changed from CentOS 7 to Rocky Linux 8 and also 
> the module system has been changed to Lmod, I get the following error message 
> when running my application:
> 
> Primary job  terminated normally, but 1 process returned
> a non-zero exit code. Per user-direction, the job has been aborted.
> mpiexec noticed that process rank 0 with PID 0 on node login18-x-2 exited on 
> signal 15 (Terminated).
> make: *** [Makefile:96: cathode-run] Error 143
> 
> When running the application with LD_DEBUG=files, it says that certain 
> symbols cannot be looked up by PETSc, e.g.:
> 
> /cvmfs/software.hpc.rwth.de/Linux/RH8/x86_64/intel/skylake_avx512/software/UCX/1.12.1-GCCcore-11.3.0/lib/ucx/libuct_cma.so.0:
>  error: symbol lookup error: undefined symbol: ucs_module_global_init (fatal)
> [...]
> /rwthfs/rz/cluster/home/mh787286/petsc/arch-linux-c-debug/lib/libpetsc.so.3.19:
>  error: symbol lookup error: undefined symbol: MPID_Abort (fatal)
> /rwthfs/rz/cluster/home/mh787286/petsc/arch-linux-c-debug/lib/libpetsc.so.3.19:
>  error: symbol lookup error: undefined symbol: ps_tool_initialize (fatal)
> 
> I attached the output in the "LD_DEBUG_make_cathode-run" file. When I look up 
> the dependencies of libpetsc.so.3.19 with the ldd command, I can find the 
> locations of the dependent libraries listed in the LD_LIBRARY_PATH (see 
> configure.log). Thus, PETSc should be able to link to these libraries.
> 
> I load the modules GCC/11.3.0 and OpenMPI/4.1.4. Furthermore, please also 
> find the make file attached as "Makefile" as well as the configure.log and 
> make.log.
> 
> Is there anything I need to change in the make file to adapt it to the new 
> module system, or are there any issues with missing links/ libraries after 
> the update?
> 
> Kind regards,
> Matthias
> 
> 
> 



Re: [petsc-users] PETSc downloading older version of OpenBLAS

2023-06-09 Thread Satish Balay via petsc-users
Yeah - the --download-openblas-commit option is a modifier to 
--download-openblas - so both options should be specified.

There is a proposal to automatically enable --download-openblas when 
--download-openblas-commit (or other similar modifier) option is specified - 
but it's not in petsc yet.

Satish

On Fri, 9 Jun 2023, Kalle Karhapää (TAU) wrote:

> I managed to download a compatible OpenBLAS with the configure line:
> 
> ./configure --download-openblas 
> --download-openblas-commit='0b678b19dc03f2a999d6e038814c4c50b9640a4e' 
> --with-openmp --with-mpi=0 --with-shared-libraries=1 --with-mumps-serial=1 
> --download-mumps --download-metis --download-slepc --with-debugging=0 
> --with-scalar-type=real --with-x=0 COPTFLAGS='-O3' CXXOPTFLAGS='-O3' 
> FOPTFLAGS='-O3';
> 
> --download-openblas-commit= didn't work by itself, but together with 
> --download-openblas it did.
> 
> case closed!
> 
> -Kalle
> From: Kalle Karhapää (TAU)
> Sent: perjantai 9. kesäkuuta 2023 8.47
> To: petsc-users@mcs.anl.gov
> Subject: PETSc downloading older version of OpenBLAS
> 
> Hi all
> 
> 
> During install I'm checking out an older version of PETSc 
> (v3.17.1-512-g27c9ef7be8) but running into problems with --download-openblas 
> in configure.
> 
> I suspect the newest version of OpenBLAS that is being downloaded from git is 
> incompatible with this older version of petsc
> 
> 
> Is there a way to have petsc --download an older version of openblas (e.g. 
> v0.3.20)?
> 
> 
> Thanks
> 
> -Kalle
> 


Re: [petsc-users] Error in building PETSc

2023-05-19 Thread Satish Balay via petsc-users
Use "make OMAKE_PRINTDIR=gmake all" instead of "make all"

or use latest release

Satish

On Fri, 19 May 2023, Jau-Uei Chen wrote:

> To whom it may concern,
> 
> Currently, I am trying to build PETSc-3.17.4 on my own laptop (MacPro Late
> 2019) but encounter an error when performing "make all". Please see the
> attachment for my configuration and make.log.
> 
> Any comments or suggestions on how to resolve this error are greatly
> appreciated.
> 
> Best Regard,
> Jau-Uei Chen
> Graduate student
> Department of Aerospace Engineering and Engineering Mechanics
> The University of Texas at Austin
> 



Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Satish Balay via petsc-users
Oops - for some reason I assumed this build was on a Mac M1. [likely due to the 
usage of '-m64' - that was strange]..

But yeah - our general usage on Mac is with xcode/clang and brew gfortran (on 
both Intel and ARM CPUs) - and unless you need Intel compilers for specific 
needs - clang/gfortran should work better for this development work.

Satish

On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:

> Hi Satish, well turns out this is not an M1 Mac, it is an older Intel Mac 
> (2019).
> I'm trying to get a local computer to do development and tests, but I also 
> have access to linux clusters with GPU which we plan to go to next.
> Thanks for the suggestion, I might also try compiling a gcc/gfortran version 
> of the lib on this computer.
> Marcos
> 
> From: Satish Balay 
> Sent: Monday, May 15, 2023 12:10 PM
> To: Vanella, Marcos (Fed) 
> Cc: petsc-users@mcs.anl.gov 
> Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
> OpenMPI
> 
> I see Intel compilers here are building x86_64 binaries - that get run on the 
> Arm M1 CPU - perhaps there are issues here with this mode of usage..
> 
> > I'm starting to work with PETSc. Our plan is to use the linear solver from 
> > PETSc for the Poisson equation on our numerical scheme and test this on a 
> > GPU cluster.
> 
> What does intel compilers provide you for this use case?
> 
> Why not use xcode/clang with gfortran here - i.e native ARM binaries?
> 
> 
> Satish
> 
> On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:
> 
> > Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
> > 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
> > Ventura 13.3.1.
> > I can compile PETSc in debug mode with this configure and make lines. I can 
> > run the PETSC tests, which seem fine.
> > When I compile the library in optimized mode, either using -O3 or O1, for 
> > example configuring with:
> >
> > $ ./configure --prefix=/opt/petsc-oneapi22u3 
> > --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
> > -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
> > FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
> > --with-shared-libraries=0 --download-make
> >
> > and using mpicc (icc), mpif90 (ifort) from  Open MPI, the static lib 
> > compiles. Yet, I see right off the bat this segfault error in the first 
> > PETSc example:
> >
> > $ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
> > PETSC_ARCH=arch-darwin-c-opt test
> > /Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
> > --no-print-directory -f 
> > /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
> > PETSC_ARCH=arch-darwin-c-opt 
> > PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
> > /opt/intel/oneapi/intelpython/latest/bin/python3 
> > /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
> > --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
> > --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
> > Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
> > PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
> >  CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
> > In file included from 
> > /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
> >  from 
> > /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
> > /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): 
> > warning #2621: attribute "warn_unused_result" does not apply here
> >   PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
> > ^
> >
> > CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
> >TEST 
> > arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
> > not ok sys_classes_draw_tests-ex1_1 # Error code: 139
> > # [excess:98681] *** Process received signal ***
> > # [excess:98681] Signal: Segmentation fault: 11 (11)
> > # [excess:98681] Signal code: Address not mapped (1)
> > # [excess:98681] Failing at address: 0x7f
> > # [excess:98681] *** End of error message ***
> > # 
> > --
> > # Primary job  terminated normally, but 1 process returned
> > # a non-zero exit code. Per user-direction, the job has been aborted.
> > # 
> > --
> > # 
> > --
> > # mpiexec noticed that process rank 0 with PID 0 on node excess exited 
> > on signal 11 (Segmentation fault: 11).
> > # 
> > --
> >  ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff
> >
> > I see the same segfault error 

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Satish Balay via petsc-users
I see Intel compilers here are building x86_64 binaries - that get run on the 
Arm M1 CPU - perhaps there are issues here with this mode of usage..

> I'm starting to work with PETSc. Our plan is to use the linear solver from 
> PETSc for the Poisson equation on our numerical scheme and test this on a GPU 
> cluster.

What does intel compilers provide you for this use case?

Why not use xcode/clang with gfortran here - i.e native ARM binaries?


Satish

On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:

> Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
> 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
> Ventura 13.3.1.
> I can compile PETSc in debug mode with this configure and make lines. I can 
> run the PETSC tests, which seem fine.
> When I compile the library in optimized mode, either using -O3 or O1, for 
> example configuring with:
> 
> $ ./configure --prefix=/opt/petsc-oneapi22u3 
> --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
> -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
> FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
> --with-shared-libraries=0 --download-make
> 
> and using mpicc (icc), mpif90 (ifort) from  Open MPI, the static lib 
> compiles. Yet, I see right off the bat this segfault error in the first PETSc 
> example:
> 
> $ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
> PETSC_ARCH=arch-darwin-c-opt test
> /Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
> --no-print-directory -f 
> /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
> PETSC_ARCH=arch-darwin-c-opt 
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
> /opt/intel/oneapi/intelpython/latest/bin/python3 
> /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
> --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
> --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
> Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
>  CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
> In file included from 
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
>  from 
> /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): 
> warning #2621: attribute "warn_unused_result" does not apply here
>   PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
> ^
> 
> CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
>TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
> not ok sys_classes_draw_tests-ex1_1 # Error code: 139
> # [excess:98681] *** Process received signal ***
> # [excess:98681] Signal: Segmentation fault: 11 (11)
> # [excess:98681] Signal code: Address not mapped (1)
> # [excess:98681] Failing at address: 0x7f
> # [excess:98681] *** End of error message ***
> # 
> --
> # Primary job  terminated normally, but 1 process returned
> # a non-zero exit code. Per user-direction, the job has been aborted.
> # 
> --
> # 
> --
> # mpiexec noticed that process rank 0 with PID 0 on node excess exited on 
> signal 11 (Segmentation fault: 11).
> # 
> --
>  ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff
> 
> I see the same segfault error in all PETSc examples.
> Any help is mostly appreciated, I'm starting to work with PETSc. Our plan is 
> to use the linear solver from PETSc for the Poisson equation on our numerical 
> scheme and test this on a GPU cluster. So also, any guideline on how to 
> interface PETSc with a fortran code and personal experience is also most 
> appreciated!
> 
> Marcos
> 
> 
> 
> 


Re: [petsc-users] Fortran preprocessor not work in pets-dev

2023-05-07 Thread Satish Balay via petsc-users
Perhaps you are not using the latest 'main' (or release) branch?

I get (with current main):

$  mpiexec -n 4 ./petsc_fppflags
 compiled by STANDARD_FORTRAN compiler
 called by rank0
 called by rank1
 called by rank2
 called by rank3

There was an issue with the early petsc-3.19 release, where one had to reorder the 
lines from:

FPPFLAGS =
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

to

include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
FPPFLAGS =

But this is fixed in latest release and main branches.
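As a sketch, a user makefile carrying that workaround on the affected 3.19.0 release would place the FPPFLAGS assignment after the includes (target and file names below are hypothetical, not from the thread):

```make
# Illustrative PETSc user makefile.  On the affected 3.19.0 release the
# FPPFLAGS assignment had to follow the two includes; on current release
# and main branches the original ordering also works.
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
FPPFLAGS = -DSTANDARD_FORTRAN

app: app.o
	-${FLINKER} $(FPPFLAGS) -o app app.o ${PETSC_LIB}
```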

Satish

On Sun, 7 May 2023, Danyang Su wrote:

> Hi Satish,
> 
> Sorry, this was a typo when copying into the email. I do use FPPFLAGS in the 
> makefile. 
> Not sure why this occurs.
> 
> Actually, not only does the preprocessor fail, PETSc initialization does not 
> work either. Attached is a very simple Fortran code, and below are the test 
> results. It looks like PETSc is not properly installed. I am working on macOS 
> Monterey version 12.5 (Intel Xeon W processor).
> 
> Compiled using petsc-3.18
> (base) ➜  petsc-dev-fppflags mpiexec -n 4 ./petsc_fppflags
>  compiled by STANDARD_FORTRAN compiler
>  called by rank0
>  called by rank1
>  called by rank2
>  called by rank3
> 
> compiled using petsc-dev
> (base) ➜  petsc-dev-fppflags mpiexec -n 4 ./petsc_fppflags
>  called by rank2
>  called by rank2
>  called by rank2
>  called by rank2
> 
> Thanks,
> 
> Danyang
> 
> On 2023-05-06, 10:22 PM, "Satish Balay"  > wrote:
> 
> 
> On Sat, 6 May 2023, Danyang Su wrote:
> 
> 
> > Hi All,
> > 
> > 
> > 
> > My code has some FPP. It works fine in PETSc 3.18 and earlier versions, but 
> > stops working in the latest PETSc-Dev. For example the following FPP 
> > STANDARD_FORTRAN is not recognized. 
> > 
> > 
> > 
> > #ifdef STANDARD_FORTRAN
> > 
> > 1 format(15x,1000a15)
> > 
> > 2 format(1pe15.6e3,1000(1pe15.6e3))
> > 
> > #else
> > 
> > 1 format(15x,a15) 
> > 
> > 2 format(1pe15.6e3,(1pe15.6e3))
> > 
> > #endif
> > 
> > 
> > 
> > In the makefile, I define the preprocessor as PPFLAGS.
> > 
> > 
> > 
> > PPFLAGS := -DLINUX -DRELEASE -DRELEASE_X64 -DSTANDARD_FORTRAN
> 
> 
> Shouldn't this be FPPFLAGS?
> 
> 
> 
> 
> Can you send us a simple test case [with the makefile] that we can try to 
> demonstrate this problem?
> 
> 
> Satish
> 
> 
> > 
> > …
> > 
> > exe: $(OBJS) chkopts
> > 
> > -${FLINKER} $(FFLAGS) $(FPPFLAGS) $(CPPFLAGS) -o $(EXENAME) $(OBJS) 
> > ${PETSC_LIB} ${LIS_LIB} ${DLIB} ${SLIB}
> > 
> > 
> > 
> > Any idea on this problem?
> > 
> > 
> > 
> > All the best,
> > 
> > 
> > 
> > 
> > 
> > 
> 
> 
> 
> 


Re: [petsc-users] Fortran preprocessor not work in pets-dev

2023-05-06 Thread Satish Balay via petsc-users
On Sat, 6 May 2023, Danyang Su wrote:

> Hi All,
> 
>  
> 
> My code has some FPP. It works fine in PETSc 3.18 and earlier versions, but 
> stops working in the latest PETSc-Dev. For example the following FPP 
> STANDARD_FORTRAN is not recognized. 
> 
>  
> 
> #ifdef STANDARD_FORTRAN
> 
>     1 format(15x,1000a15)
> 
>     2 format(1pe15.6e3,1000(1pe15.6e3))
> 
> #else
> 
>     1 format(15x,a15)    
> 
> 2 format(1pe15.6e3,(1pe15.6e3))
> 
> #endif
> 
>  
> 
> In the makefile, I define the preprocessor as PPFLAGS.
> 
>  
> 
> PPFLAGS := -DLINUX -DRELEASE -DRELEASE_X64 -DSTANDARD_FORTRAN

Shouldn't this be FPPFLAGS?


Can you send us a simple test case [with the makefile] that we can try to 
demonstrate this problem?

Satish

> 
> …
> 
> exe: $(OBJS) chkopts
> 
>     -${FLINKER} $(FFLAGS) $(FPPFLAGS) $(CPPFLAGS) -o $(EXENAME) 
> $(OBJS) ${PETSC_LIB} ${LIS_LIB} ${DLIB} ${SLIB}
> 
>  
> 
> Any idea on this problem?
> 
>  
> 
> All the best,
> 
>  
> 
>  
> 
> 


Re: [petsc-users] PETSc build asks for network connections

2023-04-28 Thread Satish Balay via petsc-users
Thanks for these notes.

FWIW - I don't see this issue on the CI MacOS boxes [2 Intel (default
install of catalina, upgraded to ventura) , 1 M1 (montery, managed by
admins)]

And I think it would be preferable to avoid these nasty scripts that
keep modifying the firewall rules [per petsc binary, with sudo for
each] - so if tweaking 'security' settings can accomplish that, we
should recommend it.

Satish

On Fri, 28 Apr 2023, Samar Khatiwala wrote:

> Hi,
> 
> I realize this is an old thread but I have some recent experience based on 
> setting up an M2 Mac that might be relevant.
> 
> I was dreading moving to Apple Silicon Macs because of issues like these but 
> I actually did not run into this particular problem.
> While I can’t be certain I think it is because in the process of installing 
> another piece of software I had to modify Apple’s security
> restrictions to make them more permissive. Details of how to do this are in 
> the following and it takes only a minute to implement:
> 
> https://rogueamoeba.com/support/knowledgebase/?showArticle=ACE-StepByStep=Audio+Hijack
> 
> Incidentally, I built mpich from source followed by PETSc in the usual way.
> 
> Something else that might be helpful for others is my experience getting 
> ifort to work. (My needs were somewhat specific: mixed
> fortran/C code, preferably ifort, and avoid package managers.) The intel 
> OneAPI installer ran smoothly (via rosetta) but when
> building mpich (or PETSc) I ran into an obvious problem: clang produces arm64 
> object files while ifort produces x86 ones. I couldn’t
> manage to set the correct CFLAGS to tell clang to target x86. Instead, the 
> (simpler) solution turned out to be (1) the fact that all the
> executables in Apple’s toolchain are universal binaries, and (2) the ‘arch’ 
> command can let you run programs for any of the two
> architectures. Specifically, executing in the terminal:
> 
> arch -x86_64 bash
> 
> starts a bash shell and *every* program that is then run from that shell is 
> automatically the x86 version. So I could then do:
> FC=ifort
> ./configure --prefix=/usr/local/mpichx86 --enable-two-level-namespace
> make
> sudo make install
> 
> and get an x86 build of mpich which I could then use (from the same shell or 
> a new one started as above) to build [x86] PETSc.
> Except for some annoying warnings from MKL (I think because it is confused 
> what architecture it is running on) everything runs
> smoothly and - even in emulation - surprisingly fast.
> 
> Sorry if this is all well know and already documented on PETSc’s install page.
> 
> Samar
> 
> On Mar 20, 2023, at 6:39 AM, Pierre Jolivet 
> mailto:pie...@joliv.et>> wrote:
> 
> 
> On 20 Mar 2023, at 2:45 AM, Barry Smith 
> mailto:bsm...@petsc.dev>> wrote:
> 
> 
>   I found a bit more information in gmakefile.test which has the magic sauce 
> used by make test to stop the firewall popups while running the test suite.
> 
> # MACOS FIREWALL HANDLING
> # - if run with MACOS_FIREWALL=1
> #   (automatically set in $PETSC_ARCH/lib/petsc/conf/petscvariables if 
> configured --with-macos-firewall-rules),
> #   ensure mpiexec and test executable is on firewall list
> #
> ifeq ($(MACOS_FIREWALL),1)
> FW := /usr/libexec/ApplicationFirewall/socketfilterfw
> # There is no reliable realpath command in macOS without need for 3rd party 
> tools like homebrew coreutils
> # Using Python's realpath seems like the most robust way here
> realpath-py = $(shell $(PYTHON) -c 'import os, sys; 
> print(os.path.realpath(sys.argv[1]))' $(1))
> #
> define macos-firewall-register
>   @APP=$(call realpath-py, $(1)); \
> if ! sudo -n true 2>/dev/null; then printf "Asking for sudo password to 
> add new firewall rule for\n  $$APP\n"; fi; \
> sudo $(FW) --remove $$APP --add $$APP --blockapp $$APP
> endef
> endif
> 
> and below. When building each executable it automatically calls 
> socketfilterfw on that executable so it won't popup.
> 
> From this I think you can reverse engineer how to turn it off for your 
> executables.
> 
> Perhaps PETSc's make ex1 etc should also apply this magic sauce, Pierre?
> 
> This configure option was added in 
> https://gitlab.com/petsc/petsc/-/merge_requests/3131 but it never worked on 
> my machines.
> I just tried again this morning a make check with MACOS_FIREWALL=1, it’s 
> asking for my password to register MPICH in the firewall, but the popups are 
> still appearing afterwards.
> That’s why I’ve never used that configure option and why I’m not sure if I 
> can trust this code from makefile.test, but I’m probably being paranoid.
> Prior to Ventura, when I was running the test suite, I manually disabled the 
> firewall https://support.apple.com/en-gb/guide/mac-help/mh11783/12.0/mac/12.0
> Apple has done yet again Apple things, and even if you disable the firewall 
> on Ventura 
> (https://support.apple.com/en-gb/guide/mac-help/mh11783/13.0/mac/13.0), the 
> popups are still appearing.
> Right now, I don’t have a solution, 

Re: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant

2023-04-26 Thread Satish Balay via petsc-users
Change at https://gitlab.com/petsc/petsc/-/merge_requests/6382

Satish

On Tue, 18 Apr 2023, Satish Balay via petsc-users wrote:

> I think it's best if configure can handle this automatically (check for broken 
> compilers). Until then - perhaps we should use:
> 
> 
> diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h
> index dd75dbbc00b..dd9ef6791c5 100644
> --- a/include/petsc/private/vecimpl.h
> +++ b/include/petsc/private/vecimpl.h
> @@ -110,12 +110,7 @@ struct _VecOps {
>PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode);
>  };
>  
> -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11))
> -  #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23)
> -    // static_assert() is a keyword since C23, before that defined as macro in assert.h
> -    #include <assert.h>
> -  #endif
> -
> +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17))
>  static_assert(offsetof(struct _VecOps, duplicate) == sizeof(void (*)(void)) * VECOP_DUPLICATE, "");
>  static_assert(offsetof(struct _VecOps, set) == sizeof(void (*)(void)) * VECOP_SET, "");
>  static_assert(offsetof(struct _VecOps, view) == sizeof(void (*)(void)) * VECOP_VIEW, "");
> 
> 
> Or just:
> 
> +#if defined(offsetof) && defined(__cplusplus)
> 
> Satish
> 
> On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote:
> 
> > This is a bug in GCC 9. Can you try the following:
> > 
> > $ make clean
> > $ make CFLAGS+='-std=gnu11'
> > 
> > Best regards,
> > 
> > Jacob Faibussowitsch
> > (Jacob Fai - booss - oh - vitch)
> > 
> > > On Apr 18, 2023, at 10:07, Zongze Yang  wrote:
> > > 
> > > No, it doesn't; it has the same problem. I just ran `make clean` and then 
> > > `make`. Do I need to reconfigure?
> > > 
> > > Best wishes,
> > > Zongze
> > > 
> > > 
> > > On Tue, 18 Apr 2023 at 21:09, Satish Balay  wrote:
> > > Does this change work?
> > > 
> > > diff --git a/include/petsc/private/vecimpl.h 
> > > b/include/petsc/private/vecimpl.h
> > > index dd75dbbc00b..168540b546e 100644
> > > --- a/include/petsc/private/vecimpl.h
> > > +++ b/include/petsc/private/vecimpl.h
> > > @@ -110,7 +110,7 @@ struct _VecOps {
> > >PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode);
> > >  };
> > > 
> > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11))
> > > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17))
> > >    #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23)
> > >      // static_assert() is a keyword since C23, before that defined as macro in assert.h
> > >      #include <assert.h>
> > > 
> > > 
> > > Satish
> > > 
> > > On Tue, 18 Apr 2023, Zongze Yang wrote:
> > > 
> > > > Hi, I am building petsc using gcc@9.5.0, and found the following error:
> > > > 
> > > > ```
> > > > In file included from /usr/include/alloca.h:25,
> > > >  from /usr/include/stdlib.h:497,
> > > >  from /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395,
> > > >  from /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7,
> > > >  from /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1:
> > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: error: expected declaration specifiers or '...' before '__builtin_offsetof'
> > > >   124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void (*)(void)) * VECOP_LOADNATIVE, "");
> > > >       |               ^~~~
> > > > In file included from /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7:
> > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant
> > > >   124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void (*)(void)) * VECOP_LOADNATIVE, "");
> > > >       |                                                                                  ^~
> > > > 
> > > > Could someone give me some hints to fix it? The configure.log and 
> > > > make.log
> > > > are attached.
> > > > 
> > > > 
> > > > Best wishes,
> > > > Zongze
> > > > 
> > > 
> > 
> 
