Hi Matt, attached is the file.
Thanks!
Marcos
________________________________
From: Matthew Knepley <[email protected]>
Sent: Monday, May 15, 2023 12:53 PM
To: Vanella, Marcos (Fed) <[email protected]>
Cc: [email protected] <[email protected]>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

     Matt

On Mon, May 15, 2023 at 12:49 PM Vanella, Marcos (Fed) 
<[email protected]<mailto:[email protected]>> wrote:
Hi Matt, I configured the lib like this:

$ ./configure --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 
--with-debugging=0 --with-shared-libraries=0 --download-make

and compiled. I still get segfault errors during the checks. See below:

$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
PETSC_ARCH=arch-darwin-c-opt check
Running check examples to verify correct installation
Using PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 and 
PETSC_ARCH=arch-darwin-c-opt
*******************Error detected during compile or link!*******************
See https://petsc.org/release/faq/
/Users/mnv/Documents/Software/petsc-3.19.1/src/snes/tutorials ex19
*********************************************************************************
mpicc -Wl,-bind_at_load -Wl,-multiply_defined,suppress -Wl,-multiply_defined 
-Wl,suppress -Wl,-commons,use_dylibs -Wl,-search_paths_first 
-Wl,-no_compact_unwind  -fPIC -wd1572 -Wno-unknown-pragmas -g -O3  
-I/Users/mnv/Documents/Software/petsc-3.19.1/include 
-I/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/include 
-I/opt/X11/include  -std=c99    ex19.c  
-L/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/lib 
-Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.1/lib 
-L/opt/intel/oneapi/mkl/2022.2.1/lib -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib 
-L/opt/openmpi414_oneapi22u3/lib 
-Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib 
-L/opt/intel/oneapi/tbb/2021.7.1/lib -L/opt/intel/oneapi/ippcp/2021.6.2/lib 
-L/opt/intel/oneapi/ipp/2021.6.2/lib 
-L/opt/intel/oneapi/dnnl/2022.2.1/cpu_iomp/lib 
-L/opt/intel/oneapi/dal/2021.7.1/lib 
-L/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib 
-L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib 
-Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib
 -L/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib 
-Wl,-rpath,/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin 
-L/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin -lpetsc 
-lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lpthread -lX11 -lmpi_usempif08 
-lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lopen-rte -lopen-pal -limf -lm -lz 
-lifport -lifcoremt -lsvml -lipgo -lirc -lpthread -lclang_rt.osx -lmpi 
-lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ -lipgo -ldecimal -lirc 
-lclang_rt.osx -lmpi -lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ 
-lipgo -ldecimal -lirc -lclang_rt.osx -o ex19
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and 
will be removed from product release in the second half of 2023. The Intel(R) 
oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. 
Please transition to use this compiler. Use '-diag-disable=10441' to disable 
this message.
In file included from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
                 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscvec.h(9),
                 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscmat.h(7),
                 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscpc.h(7),
                 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscksp.h(7),
                 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsnes.h(7),
                 from ex19.c(68):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning 
#2621: attribute "warn_unused_result" does not apply here
  PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
                                ^

Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process
See https://petsc.org/release/faq/
[excess:37807] *** Process received signal ***
[excess:37807] Signal: Segmentation fault: 11 (11)
[excess:37807] Signal code: Address not mapped (1)
[excess:37807] Failing at address: 0x7f
[excess:37807] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 0 with PID 0 on node excess exited on signal 
11 (Segmentation fault: 11).
--------------------------------------------------------------------------
Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes
See https://petsc.org/release/faq/
[excess:37831] *** Process received signal ***
[excess:37831] Signal: Segmentation fault: 11 (11)
[excess:37831] Signal code: Address not mapped (1)
[excess:37831] Failing at address: 0x7f
[excess:37831] *** End of error message ***
[excess:37832] *** Process received signal ***
[excess:37832] Signal: Segmentation fault: 11 (11)
[excess:37832] Signal code: Address not mapped (1)
[excess:37832] Failing at address: 0x7f
[excess:37832] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 1 with PID 0 on node excess exited on signal 
11 (Segmentation fault: 11).
--------------------------------------------------------------------------
Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI 
process
See https://petsc.org/release/faq/
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source
libifcoremt.dylib  000000010B7F7FE4  for__signal_handl     Unknown  Unknown
libsystem_platfor  00007FF8024C25ED  _sigtramp             Unknown  Unknown
ex5f               00000001087AFA38  PetscGetArchType      Unknown  Unknown
ex5f               000000010887913B  PetscErrorPrintfI     Unknown  Unknown
ex5f               000000010878D227  PetscInitialize_C     Unknown  Unknown
ex5f               000000010879D289  petscinitializef_     Unknown  Unknown
ex5f               0000000108713C09  petscsys_mp_petsc     Unknown  Unknown
ex5f               0000000108710B5D  MAIN__                Unknown  Unknown
ex5f               0000000108710AEE  main                  Unknown  Unknown
dyld               00007FF80213B41F  start                 Unknown  Unknown
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus 
causing
the job to be terminated. The first process to do so was:

  Process name: [[48108,1],0]
  Exit code:    174
--------------------------------------------------------------------------
Completed test examples
Error while running make check
make[1]: *** [check] Error 1
make: *** [check] Error 2

________________________________
From: Vanella, Marcos (Fed) <[email protected]>
Sent: Monday, May 15, 2023 12:20 PM
To: Matthew Knepley <[email protected]>
Cc: [email protected]<mailto:[email protected]> 
<[email protected]<mailto:[email protected]>>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Thank you Matt I'll try this and let you know.
Marcos
________________________________
From: Matthew Knepley <[email protected]>
Sent: Monday, May 15, 2023 12:08 PM
To: Vanella, Marcos (Fed) <[email protected]>
Cc: [email protected]<mailto:[email protected]> 
<[email protected]<mailto:[email protected]>>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

On Mon, May 15, 2023 at 11:19 AM Vanella, Marcos (Fed) via petsc-users <[email protected]> wrote:
Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
Ventura 13.3.1.
I can compile PETSc in debug mode with these configure and make lines, and I can run the PETSc tests, which seem fine.
When I compile the library in optimized mode, using either -O3 or -O1, for example configuring with:

I hate to yell "compiler bug" when this happens, but it sure seems like one. 
Can you just use

  --with-debugging=0

without the custom COPTFLAGS, CXXOPTFLAGS, and FOPTFLAGS? If that works, it is almost
certainly a compiler bug. If not, then we can go into the debugger and see what
is failing.

  Thanks,

    Matt

$ ./configure --prefix=/opt/petsc-oneapi22u3 
--with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
-diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
--with-shared-libraries=0 --download-make

and using mpicc (icc) and mpif90 (ifort) from Open MPI, the static library compiles.
Yet I see, right off the bat, this segfault error in the first PETSc example:

$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
PETSC_ARCH=arch-darwin-c-opt test
/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
--no-print-directory -f 
/Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
PETSC_ARCH=arch-darwin-c-opt 
PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
/opt/intel/oneapi/intelpython/latest/bin/python3 
/Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
--petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
--petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
         CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
In file included from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
                 from 
/Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning 
#2621: attribute "warn_unused_result" does not apply here
  PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
                                ^

    CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
       TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
not ok sys_classes_draw_tests-ex1_1 # Error code: 139
#     [excess:98681] *** Process received signal ***
#     [excess:98681] Signal: Segmentation fault: 11 (11)
#     [excess:98681] Signal code: Address not mapped (1)
#     [excess:98681] Failing at address: 0x7f
#     [excess:98681] *** End of error message ***
#     --------------------------------------------------------------------------
#     Primary job  terminated normally, but 1 process returned
#     a non-zero exit code. Per user-direction, the job has been aborted.
#     --------------------------------------------------------------------------
#     --------------------------------------------------------------------------
#     mpiexec noticed that process rank 0 with PID 0 on node excess exited on 
signal 11 (Segmentation fault: 11).
#     --------------------------------------------------------------------------
 ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff

I see the same segfault error in all PETSc examples.
Any help is most appreciated; I'm just starting to work with PETSc. Our plan is to
use PETSc's linear solver for the Poisson equation in our numerical scheme and to
test it on a GPU cluster. Any guidance on interfacing PETSc with a Fortran code,
along with any personal experience, would also be most appreciated!

Marcos





--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


#if !defined(INCLUDED_PETSCCONF_H)
#define INCLUDED_PETSCCONF_H

#define PETSC_ARCH "arch-darwin-c-opt"
#define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size)))
#define PETSC_BLASLAPACK_UNDERSCORE 1
#define PETSC_CLANGUAGE_C 1
#define PETSC_CXX_RESTRICT __restrict
#define PETSC_DEPRECATED_ENUM(why)  
#define PETSC_DEPRECATED_FUNCTION(why) __attribute__((deprecated))
#define PETSC_DEPRECATED_MACRO(why) _Pragma(why)
#define PETSC_DEPRECATED_TYPEDEF(why) __attribute__((deprecated))
#define PETSC_DIR "/Users/mnv/Documents/Software/petsc-3.19.1"
#define PETSC_DIR_SEPARATOR '/'
#define PETSC_DO_NOT_SWAP_CHILD_FOR_DEBUGGER 1
#define PETSC_FORTRAN_CHARLEN_T int
#define PETSC_FORTRAN_TYPE_INITIALIZE  = -2
#define PETSC_FUNCTION_NAME_C __func__
#define PETSC_FUNCTION_NAME_CXX __func__
#define PETSC_HAVE_ACCESS 1
#define PETSC_HAVE_ATOLL 1
#define PETSC_HAVE_ATTRIBUTEALIGNED 1
#define PETSC_HAVE_BUILTIN_EXPECT 1
#define PETSC_HAVE_BZERO 1
#define PETSC_HAVE_C99_COMPLEX 1
#define PETSC_HAVE_CLOCK 1
#define PETSC_HAVE_CXX 1
#define PETSC_HAVE_CXX_ATOMIC 1
#define PETSC_HAVE_CXX_COMPLEX 1
#define PETSC_HAVE_CXX_COMPLEX_FIX 1
#define PETSC_HAVE_CXX_DIALECT_CXX11 1
#define PETSC_HAVE_CXX_DIALECT_CXX14 1
#define PETSC_HAVE_CXX_DIALECT_CXX17 1
#define PETSC_HAVE_DLADDR 1
#define PETSC_HAVE_DLCLOSE 1
#define PETSC_HAVE_DLERROR 1
#define PETSC_HAVE_DLFCN_H 1
#define PETSC_HAVE_DLOPEN 1
#define PETSC_HAVE_DLSYM 1
#define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1
#define PETSC_HAVE_DRAND48 1
#define PETSC_HAVE_DYNAMIC_LIBRARIES 1
#define PETSC_HAVE_ERF 1
#define PETSC_HAVE_EXECUTABLE_EXPORT 1
#define PETSC_HAVE_FCNTL_H 1
#define PETSC_HAVE_FENV_H 1
#define PETSC_HAVE_FE_VALUES 1
#define PETSC_HAVE_FLOAT_H 1
#define PETSC_HAVE_FORK 1
#define PETSC_HAVE_FORTRAN 1
#define PETSC_HAVE_FORTRAN_FLUSH 1
#define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1
#define PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT 1
#define PETSC_HAVE_FORTRAN_TYPE_STAR 1
#define PETSC_HAVE_FORTRAN_UNDERSCORE 1
#define PETSC_HAVE_GETCWD 1
#define PETSC_HAVE_GETDOMAINNAME 1
#define PETSC_HAVE_GETHOSTBYNAME 1
#define PETSC_HAVE_GETHOSTNAME 1
#define PETSC_HAVE_GETPAGESIZE 1
#define PETSC_HAVE_GETRUSAGE 1
#define PETSC_HAVE_GETWD 1
#define PETSC_HAVE_IMMINTRIN_H 1
#define PETSC_HAVE_INTTYPES_H 1
#define PETSC_HAVE_ISINF 1
#define PETSC_HAVE_ISNAN 1
#define PETSC_HAVE_ISNORMAL 1
#define PETSC_HAVE_LGAMMA 1
#define PETSC_HAVE_LOG2 1
#define PETSC_HAVE_LSEEK 1
#define PETSC_HAVE_MACHINE_ENDIAN_H 1
#define PETSC_HAVE_MAKE 1
#define PETSC_HAVE_MEMMOVE 1
#define PETSC_HAVE_MKL_INCLUDES 1
#define PETSC_HAVE_MKL_LIBS 1
#define PETSC_HAVE_MKSTEMP 1
#define PETSC_HAVE_MMAP 1
#define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP
#define PETSC_HAVE_MPIIO 1
#define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1
#define PETSC_HAVE_MPI_COMBINER_DUP 1
#define PETSC_HAVE_MPI_COMBINER_NAMED 1
#define PETSC_HAVE_MPI_F90MODULE 1
#define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1
#define PETSC_HAVE_MPI_GET_ACCUMULATE 1
#define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1
#define PETSC_HAVE_MPI_INIT_THREAD 1
#define PETSC_HAVE_MPI_INT64_T 1
#define PETSC_HAVE_MPI_LONG_DOUBLE 1
#define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1
#define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1
#define PETSC_HAVE_MPI_ONE_SIDED 1
#define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1
#define PETSC_HAVE_MPI_REDUCE_LOCAL 1
#define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1
#define PETSC_HAVE_MPI_RGET 1
#define PETSC_HAVE_MPI_WIN_CREATE 1
#define PETSC_HAVE_NANOSLEEP 1
#define PETSC_HAVE_NETDB_H 1
#define PETSC_HAVE_NETINET_IN_H 1
#define PETSC_HAVE_OMPI_MAJOR_VERSION 4
#define PETSC_HAVE_OMPI_MINOR_VERSION 1
#define PETSC_HAVE_OMPI_RELEASE_VERSION 4
#define PETSC_HAVE_PACKAGES ":blaslapack:make:mathlib:mpi:pthread:regex:x11:"
#define PETSC_HAVE_POPEN 1
#define PETSC_HAVE_POSIX_MEMALIGN 1
#define PETSC_HAVE_PTHREAD 1
#define PETSC_HAVE_PWD_H 1
#define PETSC_HAVE_RAND 1
#define PETSC_HAVE_READLINK 1
#define PETSC_HAVE_REALPATH 1
#define PETSC_HAVE_REGEX 1
#define PETSC_HAVE_RTLD_DEFAULT 1
#define PETSC_HAVE_RTLD_GLOBAL 1
#define PETSC_HAVE_RTLD_LAZY 1
#define PETSC_HAVE_RTLD_LOCAL 1
#define PETSC_HAVE_RTLD_NOW 1
#define PETSC_HAVE_SETJMP_H 1
#define PETSC_HAVE_SLEEP 1
#define PETSC_HAVE_SNPRINTF 1
#define PETSC_HAVE_SOCKET 1
#define PETSC_HAVE_SO_REUSEADDR 1
#define PETSC_HAVE_STDATOMIC_H 1
#define PETSC_HAVE_STDINT_H 1
#define PETSC_HAVE_STRCASECMP 1
#define PETSC_HAVE_STRINGS_H 1
#define PETSC_HAVE_STRUCT_SIGACTION 1
#define PETSC_HAVE_SYS_PARAM_H 1
#define PETSC_HAVE_SYS_RESOURCE_H 1
#define PETSC_HAVE_SYS_SOCKET_H 1
#define PETSC_HAVE_SYS_TIMES_H 1
#define PETSC_HAVE_SYS_TIME_H 1
#define PETSC_HAVE_SYS_TYPES_H 1
#define PETSC_HAVE_SYS_UTSNAME_H 1
#define PETSC_HAVE_SYS_WAIT_H 1
#define PETSC_HAVE_TAU_PERFSTUBS 1
#define PETSC_HAVE_TGAMMA 1
#define PETSC_HAVE_TIME 1
#define PETSC_HAVE_TIME_H 1
#define PETSC_HAVE_UNAME 1
#define PETSC_HAVE_UNISTD_H 1
#define PETSC_HAVE_USLEEP 1
#define PETSC_HAVE_VA_COPY 1
#define PETSC_HAVE_VSNPRINTF 1
#define PETSC_HAVE_X 1
#define PETSC_HAVE_XMMINTRIN_H 1
#define PETSC_HAVE__SLEEP 1
#define PETSC_HAVE___INT64 1
#define PETSC_INTPTR_T intptr_t
#define PETSC_INTPTR_T_FMT "#" PRIxPTR
#define PETSC_IS_COLORING_MAX USHRT_MAX
#define PETSC_IS_COLORING_VALUE_TYPE short
#define PETSC_IS_COLORING_VALUE_TYPE_F integer2
#define PETSC_LEVEL1_DCACHE_LINESIZE 64
#define PETSC_LIB_DIR 
"/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/lib"
#define PETSC_MAX_PATH_LEN 1024
#define PETSC_MEMALIGN 16
#define PETSC_MPICC_SHOW "icc -I/opt/openmpi414_oneapi22u3/include 
-L/opt/openmpi414_oneapi22u3/lib -lmpi -lopen-rte -lopen-pal -lm -lz"
#define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT
#define PETSC_OMAKE 
"/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
--no-print-directory"
#define PETSC_PREFETCH_HINT_NTA _MM_HINT_NTA
#define PETSC_PREFETCH_HINT_T0 _MM_HINT_T0
#define PETSC_PREFETCH_HINT_T1 _MM_HINT_T1
#define PETSC_PREFETCH_HINT_T2 _MM_HINT_T2
#define PETSC_PYTHON_EXE "/opt/intel/oneapi/intelpython/latest/bin/python3"
#define PETSC_Prefetch(a,b,c) _mm_prefetch((const char*)(a),(c))
#define PETSC_REPLACE_DIR_SEPARATOR '\\'
#define PETSC_SIGNAL_CAST  
#define PETSC_SIZEOF_INT 4
#define PETSC_SIZEOF_LONG 8
#define PETSC_SIZEOF_LONG_LONG 8
#define PETSC_SIZEOF_SIZE_T 8
#define PETSC_SIZEOF_VOID_P 8
#define PETSC_SLSUFFIX ""
#define PETSC_UINTPTR_T uintptr_t
#define PETSC_UINTPTR_T_FMT "#" PRIxPTR
#define PETSC_UNUSED __attribute((unused))
#define PETSC_USE_AVX512_KERNELS 1
#define PETSC_USE_BACKWARD_LOOP 1
#define PETSC_USE_CTABLE 1
#define PETSC_USE_DEBUGGER "lldb"
#define PETSC_USE_DMLANDAU_2D 1
#define PETSC_USE_INFO 1
#define PETSC_USE_ISATTY 1
#define PETSC_USE_LOG 1
#define PETSC_USE_MALLOC_COALESCED 1
#define PETSC_USE_REAL_DOUBLE 1
#define PETSC_USE_SINGLE_LIBRARY 1
#define PETSC_USE_SOCKET_VIEWER 1
#define PETSC_USING_64BIT_PTR 1
#define PETSC_USING_DARWIN 1
#define PETSC_USING_F2003 1
#define PETSC_USING_F90FREEFORM 1
#define PETSC__BSD_SOURCE 1
#define PETSC__DEFAULT_SOURCE 1
#define PETSC__GNU_SOURCE 1
#endif
