Hi Satish, it turns out this is not an M1 Mac, it is an older Intel Mac 
(2019).
I'm trying to set up a local computer for development and testing, but I also 
have access to Linux clusters with GPUs, which we plan to move to next.
Thanks for the suggestion. I might also try compiling a gcc/gfortran version 
of the library on this computer.
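
A first guess at that configure line (options still to be refined; not sure 
yet whether I'd keep MKL or just use --download-fblaslapack):

$ ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran \
    --download-openmpi --download-fblaslapack --with-debugging=0
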
Marcos
________________________________
From: Satish Balay <[email protected]>
Sent: Monday, May 15, 2023 12:10 PM
To: Vanella, Marcos (Fed) <[email protected]>
Cc: [email protected] <[email protected]>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

I see the Intel compilers here are building x86_64 binaries - that then get 
run on the Arm M1 CPU - perhaps there are issues with this mode of usage.

> I'm starting to work with PETSc. Our plan is to use the linear solver from 
> PETSc for the Poisson equation in our numerical scheme and test this on a 
> GPU cluster.

What do the Intel compilers provide you for this use case?

Why not use xcode/clang with gfortran here - i.e. native Arm binaries?
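
something like [just a sketch - assuming gfortran from homebrew; adjust the 
blas/lapack and MPI options to your setup]:

$ ./configure --with-cc=clang --with-cxx=clang++ --with-fc=gfortran \
    --download-openmpi --download-fblaslapack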


Satish

On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:

> Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
> 4.1.4 and the OneAPI 2022 Update 2 Intel compiler suite on a Mac with macOS 
> Ventura 13.3.1.
> I can compile PETSc in debug mode with the corresponding configure and make 
> lines, and I can run the PETSc tests, which seem fine.
> When I compile the library in optimized mode, using either -O3 or -O1, for 
> example configuring with:
>
> $ ./configure --prefix=/opt/petsc-oneapi22u3 
> --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
> -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
> FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
> --with-shared-libraries=0 --download-make
>
> and using mpicc (icc) and mpif90 (ifort) from Open MPI, the static library 
> compiles. Yet right off the bat I see this segfault error in the first 
> PETSc example:
>
> $ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
> PETSC_ARCH=arch-darwin-c-opt test
> /Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
> --no-print-directory -f 
> /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
> PETSC_ARCH=arch-darwin-c-opt 
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
> /opt/intel/oneapi/intelpython/latest/bin/python3 
> /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
> --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
> --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
> Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
>          CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
> In file included from 
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
>                  from 
> /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): 
> warning #2621: attribute "warn_unused_result" does not apply here
>   PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
>                                 ^
>
>     CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
>        TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
> not ok sys_classes_draw_tests-ex1_1 # Error code: 139
> #     [excess:98681] *** Process received signal ***
> #     [excess:98681] Signal: Segmentation fault: 11 (11)
> #     [excess:98681] Signal code: Address not mapped (1)
> #     [excess:98681] Failing at address: 0x7f
> #     [excess:98681] *** End of error message ***
> #     
> --------------------------------------------------------------------------
> #     Primary job  terminated normally, but 1 process returned
> #     a non-zero exit code. Per user-direction, the job has been aborted.
> #     
> --------------------------------------------------------------------------
> #     
> --------------------------------------------------------------------------
> #     mpiexec noticed that process rank 0 with PID 0 on node excess exited on 
> signal 11 (Segmentation fault: 11).
> #     
> --------------------------------------------------------------------------
>  ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff
>
> I see the same segfault error in all the PETSc examples.
> Any help is most appreciated; I'm just starting to work with PETSc. Our plan 
> is to use the PETSc linear solver for the Poisson equation in our numerical 
> scheme, and to test this on a GPU cluster. Also, any guidance on how to 
> interface PETSc with a Fortran code, and any personal experience with this, 
> would be most appreciated!
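>
> To give an idea of where we want to go, the kind of Fortran driver I have 
> in mind is roughly the sketch below (untested, put together from the 
> Fortran examples in the PETSc docs; the 1-D Poisson matrix, sizes, and rhs 
> are just placeholders for our actual discretization):
>
> program poisson1d
> ! Untested sketch: solve a 1-D Poisson system with KSP from Fortran.
> ! Build as a .F90 file so the finclude header below gets preprocessed.
> #include <petsc/finclude/petscksp.h>
>       use petscksp
>       implicit none
>       Mat            A
>       Vec            x, b
>       KSP            ksp
>       PetscErrorCode ierr
>       PetscInt       i, n, nc, ione, row(1), col(3)
>       PetscScalar    one, val(3)
>
>       n = 100; ione = 1; one = 1.0
>       call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>
>       ! Tridiagonal 1-D Laplacian; PETSc rows/cols are 0-based. Serial
>       ! assembly here - in parallel we would set only locally owned rows.
>       call MatCreate(PETSC_COMM_WORLD, A, ierr)
>       call MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n, ierr)
>       call MatSetFromOptions(A, ierr)
>       call MatSetUp(A, ierr)
>       do i = 0, n-1
>          nc = 0
>          if (i > 0) then
>             nc = nc + 1; col(nc) = i - 1; val(nc) = -1.0
>          end if
>          nc = nc + 1; col(nc) = i; val(nc) = 2.0
>          if (i < n-1) then
>             nc = nc + 1; col(nc) = i + 1; val(nc) = -1.0
>          end if
>          row(1) = i
>          call MatSetValues(A, ione, row, nc, col, val, INSERT_VALUES, ierr)
>       end do
>       call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>       call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>
>       ! Solution and right-hand-side vectors; rhs = 1 as a placeholder
>       call MatCreateVecs(A, x, b, ierr)
>       call VecSet(b, one, ierr)
>
>       ! Solver choice left to the command line (-ksp_type, -pc_type, ...)
>       call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>       call KSPSetOperators(ksp, A, A, ierr)
>       call KSPSetFromOptions(ksp, ierr)
>       call KSPSolve(ksp, b, x, ierr)
>
>       call KSPDestroy(ksp, ierr)
>       call MatDestroy(A, ierr)
>       call VecDestroy(x, ierr)
>       call VecDestroy(b, ierr)
>       call PetscFinalize(ierr)
> end program poisson1d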
>
> Marcos
>
