Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-15 Thread Zongze Yang
Got it. Thank you for your explanation!

Best wishes,
Zongze


On Mon, 15 May 2023 at 23:28, Matthew Knepley  wrote:

> On Mon, May 15, 2023 at 9:55 AM Zongze Yang  wrote:
>
>> On Mon, 15 May 2023 at 17:24, Matthew Knepley  wrote:
>>
>>> On Sun, May 14, 2023 at 7:23 PM Zongze Yang 
>>> wrote:
>>>
 Could you try to project the coordinates into the continuity space by
 enabling the option
 `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?

>>>
>>> There is a comment in the code about that:
>>>
>>>   /* XXX FIXME Requires DMPlexSetClosurePermutationLexicographic() */
>>>
>>> So what is currently done is you project into the discontinuous space
>>> from the GMsh coordinates,
>>> and then we get the continuous coordinates from those later. This is why
>>> we get the right answer.
>>>
>>>
>> Sorry, I'm having difficulty understanding the comment and your intended
>> meaning. Are you saying that we can only project into a discontinuous
>> space?
>>
>
> For higher order simplices, because we do not have the mapping to the GMsh
> order yet.
>
>
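For reference, a minimal sketch of how the resulting coordinate layout can be
inspected (assuming a DMPlex `dm` whose coordinates are already set up; only
standard PETSc calls are used). The coordinate PetscSection records how many
coordinate dofs sit on each mesh point - vertices only for linear coordinates,
plus edges/faces/cells for higher order - and where they live in the local
coordinate Vec:

```
#include <petscdmplex.h>

/* Print, for every mesh point, how many coordinate dofs it carries and their values. */
static PetscErrorCode PrintCoordinateLayout(DM dm)
{
  PetscSection       cs;
  Vec                coords;
  const PetscScalar *a;
  PetscInt           pStart, pEnd;

  PetscFunctionBeginUser;
  PetscCall(DMGetCoordinateSection(dm, &cs));    /* layout: point -> (ndofs, offset) */
  PetscCall(DMGetCoordinatesLocal(dm, &coords)); /* the coordinate values themselves */
  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  PetscCall(VecGetArrayRead(coords, &a));
  for (PetscInt p = pStart; p < pEnd; ++p) {
    PetscInt dof, off;

    PetscCall(PetscSectionGetDof(cs, p, &dof));
    PetscCall(PetscSectionGetOffset(cs, p, &off));
    if (!dof) continue; /* this point carries no coordinate dofs */
    PetscCall(PetscPrintf(PETSC_COMM_SELF, "point %" PetscInt_FMT ":", p));
    for (PetscInt d = 0; d < dof; ++d) PetscCall(PetscPrintf(PETSC_COMM_SELF, " %g", (double)PetscRealPart(a[off + d])));
    PetscCall(PetscPrintf(PETSC_COMM_SELF, "\n"));
  }
  PetscCall(VecRestoreArrayRead(coords, &a));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```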
>> Additionally, should we always set
>> `dm_plex_gmsh_project_petscdualspace_lagrange_continuity` to false for
>> high-order gmsh files?
>>
>
> This is done automatically if you do not override it.
>
>
>> With the option set to `true`, I got the following error:
>>
>
> Yes, do not do that.
>
>   Thanks,
>
>  Matt
>
>
>> ```
>> $ $PETSC_DIR/$PETSC_ARCH/tests/dm/impls/plex/tests/runex33_gmsh_3d_q2.sh
>> -e "-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true"
>> not ok dm_impls_plex_tests-ex33_gmsh_3d_q2 # Error code: 77
>> #   Volume: 0.46875
>> #   [0]PETSC ERROR: - Error Message
>> --
>> #   [0]PETSC ERROR: Petsc has generated inconsistent data
>> #   [0]PETSC ERROR: Calculated volume 0.46875 != 1. actual volume
>> (error 0.53125 > 1e-06 tol)
>> #   [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
>> shooting.
>> #   [0]PETSC ERROR: Petsc Development GIT revision:
>> v3.19.1-294-g9cc24bc9b93  GIT Date: 2023-05-15 12:07:10 +
>> #   [0]PETSC ERROR: ../ex33 on a arch-linux-c-debug named AMA-PC-RA18
>> by yzz Mon May 15 21:53:43 2023
>> #   [0]PETSC ERROR: Configure options
>> --CFLAGS=-I/opt/intel/oneapi/mkl/latest/include
>> --CXXFLAGS=-I/opt/intel/oneapi/mkl/latest/include
>> --LDFLAGS="-Wl,-rpath,/opt/intel/oneapi/mkl/latest/lib/intel64
>> -L/opt/intel/oneapi/mkl/latest/lib/intel64" --download-bison
>> --download-chaco --download-cmake
>> --download-eigen="/home/yzz/firedrake/complex-int32-mkl-X-debug/src/eigen-3.3.3.tgz
>> " --download-fftw --download-hdf5 --download-hpddm --download-hwloc
>> --download-libpng --download-metis --download-mmg --download-mpich
>> --download-mumps --download-netcdf --download-p4est --download-parmmg
>> --download-pastix --download-pnetcdf --download-ptscotch
>> --download-scalapack --download-slepc --download-suitesparse
>> --download-superlu_dist --download-tetgen --download-triangle
>> --with-blaslapack-dir=/opt/intel/oneapi/mkl/latest --with-c2html=0
>> --with-debugging=1 --with-fortran-bindings=0
>> --with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/latest
>> --with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/latest
>> --with-scalar-type=complex --with-shared-libraries=1 --with-x=1 --with-zlib
>> PETSC_ARCH=arch-linux-c-debug
>> #   [0]PETSC ERROR: #1 CheckVolume() at
>> /home/yzz/opt/petsc/src/dm/impls/plex/tests/ex33.c:246
>> #   [0]PETSC ERROR: #2 main() at
>> /home/yzz/opt/petsc/src/dm/impls/plex/tests/ex33.c:261
>> #   [0]PETSC ERROR: PETSc Option Table entries:
>> #   [0]PETSC ERROR: -coord_space 0 (source: command line)
>> #   [0]PETSC ERROR: -dm_plex_filename
>> /home/yzz/opt/petsc/share/petsc/datafiles/meshes/cube_q2.msh (source:
>> command line)
>> #   [0]PETSC ERROR: -dm_plex_gmsh_project (source: command line)
>> #   [0]PETSC ERROR:
>> -dm_plex_gmsh_project_petscdualspace_lagrange_continuity true (source:
>> command line)
>> #   [0]PETSC ERROR: -tol 1e-6 (source: command line)
>> #   [0]PETSC ERROR: -volume 1.0 (source: command line)
>> #   [0]PETSC ERROR: End of Error Message ---send
>> entire error message to petsc-ma...@mcs.anl.gov--
>> #   application called MPI_Abort(MPI_COMM_SELF, 77) - process 0
>>  ok dm_impls_plex_tests-ex33_gmsh_3d_q2 # SKIP Command failed so no diff
>> ```
>>
>>
>> Best wishes,
>> Zongze
>>
>>   Thanks,
>>>
>>>  Matt
>>>
>>>
 Best wishes,
 Zongze


 On Mon, 15 May 2023 at 04:24, Matthew Knepley 
 wrote:

> On Sun, May 14, 2023 at 12:27 PM Zongze Yang 
> wrote:
>
>>
>>
>>
>> On Sun, 14 May 2023 at 23:54, Matthew Knepley 
>> wrote:
>>
>>> On Sun, May 14, 2023 at 9:21 AM Zongze Yang 
>>> wrote:
>>>
 Hi, Matt,

 The issue has been resolved 

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Thank you Matt and Samar,
It seems the segfaults I see are related to icc, which is no longer being 
updated. The recommended Intel C compiler is icx, which is not released for 
Macs. I compiled the lib with gcc 13 + OpenMPI from Homebrew and the tests pass 
just fine in optimized mode. I will follow Samar's comment, build OpenMPI with 
clang + ifort, and check that PETSc works fine with it. It might be time to get 
rid of icc in our bundle-building process for Macs; I keep getting this 
warning:

icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and 
will be removed from product release in the second half of 2023. The Intel(R) 
oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. 
Please transition to use this compiler.

Thanks for the help!
Marcos
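
(A sketch of that kind of Open MPI build - the compilers are passed to Open
MPI's configure through the standard CC/CXX/FC variables, the same way Samar's
MPICH line further down sets FC; the install prefix below is only a
placeholder:)

$ ./configure CC=clang CXX=clang++ FC=ifort --prefix=/opt/openmpi-clang-ifort
$ make -j 8 && make install
$ export PATH=/opt/openmpi-clang-ifort/bin:$PATH   # so mpicc/mpif90 pick up this build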

From: Samar Khatiwala 
Sent: Monday, May 15, 2023 1:22 PM
To: Vanella, Marcos (Fed) 
Cc: Matthew Knepley ; petsc-users@mcs.anl.gov 

Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Hi Marcos,

Yes, I compiled with clang instead of icc (no particular reason for this; I 
tend to use gcc/clang). I use mpich4.1.1, which I first built with clang and 
ifort:


FC=ifort

./configure --prefix=/usr/local/mpich4 --enable-two-level-namespace


Samar

On May 15, 2023, at 6:07 PM, Vanella, Marcos (Fed)  
wrote:

Hi Samar, what MPI library do you use? Did you compile it with clang instead of 
icc?
Thanks,
Marcos

From: Samar Khatiwala 
Sent: Monday, May 15, 2023 1:05 PM
To: Matthew Knepley 
Cc: Vanella, Marcos (Fed) ; petsc-users@mcs.anl.gov 

Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Hi, for what it’s worth, clang + ifort from OneAPI 2023 update 1 works fine for 
me on both Intel and M2 Macs. So it might just be a matter of upgrading.

Samar

On May 15, 2023, at 5:53 PM, Matthew Knepley  wrote:

Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

 Matt

On Mon, May 15, 2023 at 12:49 PM Vanella, Marcos (Fed) 
mailto:marcos.vane...@nist.gov>> wrote:
Hi Matt, I configured the lib like this:

$ ./configure --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 
--with-debugging=0 --with-shared-libraries=0 --download-make

and compiled. I still get some check segfault error. See below:

$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
PETSC_ARCH=arch-darwin-c-opt check
Running check examples to verify correct installation
Using PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 and 
PETSC_ARCH=arch-darwin-c-opt
***Error detected during compile or link!***
See https://petsc.org/release/faq/
/Users/mnv/Documents/Software/petsc-3.19.1/src/snes/tutorials ex19
*
mpicc -Wl,-bind_at_load -Wl,-multiply_defined,suppress -Wl,-multiply_defined 
-Wl,suppress -Wl,-commons,use_dylibs -Wl,-search_paths_first 
-Wl,-no_compact_unwind  -fPIC -wd1572 -Wno-unknown-pragmas -g -O3  
-I/Users/mnv/Documents/Software/petsc-3.19.1/include 
-I/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/include 
-I/opt/X11/include  -std=c99 ex19.c  
-L/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/lib 
-Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.1/lib 
-L/opt/intel/oneapi/mkl/2022.2.1/lib -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib 
-L/opt/openmpi414_oneapi22u3/lib 
-Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib 
-L/opt/intel/oneapi/tbb/2021.7.1/lib -L/opt/intel/oneapi/ippcp/2021.6.2/lib 
-L/opt/intel/oneapi/ipp/2021.6.2/lib 
-L/opt/intel/oneapi/dnnl/2022.2.1/cpu_iomp/lib 
-L/opt/intel/oneapi/dal/2021.7.1/lib 
-L/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib 
-L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib 
-Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib
 -L/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib 
-Wl,-rpath,/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin 
-L/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin -lpetsc 
-lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lpthread -lX11 -lmpi_usempif08 
-lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lopen-rte -lopen-pal -limf -lm -lz 
-lifport -lifcoremt -lsvml -lipgo -lirc -lpthread -lclang_rt.osx -lmpi 
-lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ -lipgo -ldecimal -lirc 
-lclang_rt.osx -lmpi -lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ 
-lipgo -ldecimal -lirc -lclang_rt.osx -o ex19
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and 
will be removed from product release in the second half of 2023. The Intel(R) 
oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. 
Please transition to use this compiler. Use '-diag-disable=10441' to disable 
this message.
In file included from 

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Samar Khatiwala
Hi Marcos,

Yes, I compiled with clang instead of icc (no particular reason for this; I 
tend to use gcc/clang). I use mpich4.1.1, which I first built with clang and 
ifort:


FC=ifort

./configure --prefix=/usr/local/mpich4 --enable-two-level-namespace


Samar

On May 15, 2023, at 6:07 PM, Vanella, Marcos (Fed)  
wrote:

Hi Samar, what MPI library do you use? Did you compile it with clang instead of 
icc?
Thanks,
Marcos

From: Samar Khatiwala 
Sent: Monday, May 15, 2023 1:05 PM
To: Matthew Knepley 
Cc: Vanella, Marcos (Fed) ; petsc-users@mcs.anl.gov 

Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Hi, for what it’s worth, clang + ifort from OneAPI 2023 update 1 works fine for 
me on both Intel and M2 Macs. So it might just be a matter of upgrading.

Samar

On May 15, 2023, at 5:53 PM, Matthew Knepley  wrote:

Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

 Matt


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Matthew Knepley
On Mon, May 15, 2023 at 1:04 PM Vanella, Marcos (Fed) <
marcos.vane...@nist.gov> wrote:

> Hi Matt, attached is the file.
>

Okay, you are failing in this function

PetscErrorCode PetscGetArchType(char str[], size_t slen)
{
  PetscFunctionBegin;
#if defined(PETSC_ARCH)
  PetscCall(PetscStrncpy(str, PETSC_ARCH, slen - 1));
#else
  #error "$PETSC_ARCH/include/petscconf.h is missing PETSC_ARCH"
#endif
  PetscFunctionReturn(PETSC_SUCCESS);
}

Now, PETSC_ARCH is defined in the header you sent, so it is likely that some
other header is being picked up by mistake from some other, broken build.

I would completely clean out your PETSc installation and start from scratch.

  Thanks,

  Matt
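
(A sketch of how one might check which petscconf.h the compile is actually
picking up, and then start clean - the -H flag below is the gcc/clang/icc
option that prints every header as it is included, and the paths assume the
PETSC_DIR/PETSC_ARCH from the log above:)

$ find $PETSC_DIR -name petscconf.h        # list every candidate copy of the header
$ cd $PETSC_DIR/src/snes/tutorials
$ mpicc -H -c ex19.c -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include 2>&1 | grep petscconf
$ rm -rf $PETSC_DIR/$PETSC_ARCH            # then re-run ./configure and make from scratch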


> Thanks!
> Marcos
> --
> *From:* Matthew Knepley 
> *Sent:* Monday, May 15, 2023 12:53 PM
> *To:* Vanella, Marcos (Fed) 
> *Cc:* petsc-users@mcs.anl.gov 
> *Subject:* Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers
> and OpenMPI
>
> Send us
>
>   $PETSC_ARCH/include/petscconf.h
>
>   Thanks,
>
>  Matt
>

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Hi Samar, what MPI library do you use? Did you compile it with clang instead of 
icc?
Thanks,
Marcos

From: Samar Khatiwala 
Sent: Monday, May 15, 2023 1:05 PM
To: Matthew Knepley 
Cc: Vanella, Marcos (Fed) ; petsc-users@mcs.anl.gov 

Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Hi, for what it’s worth, clang + ifort from OneAPI 2023 update 1 works fine for 
me on both Intel and M2 Macs. So it might just be a matter of upgrading.

Samar

On May 15, 2023, at 5:53 PM, Matthew Knepley  wrote:

Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

 Matt


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Samar Khatiwala
Hi, for what it’s worth, clang + ifort from OneAPI 2023 update 1 works fine for 
me on both Intel and M2 Macs. So it might just be a matter of upgrading.

Samar

On May 15, 2023, at 5:53 PM, Matthew Knepley  wrote:

Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

 Matt


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Hi Matt, attached is the file.
Thanks!
Marcos

From: Matthew Knepley 
Sent: Monday, May 15, 2023 12:53 PM
To: Vanella, Marcos (Fed) 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

 Matt


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Hi Satish, yes, the -m64 flag tells the compilers that the target CPU is Intel 64.

The only reason I'm trying to get PETSc working with the Intel compilers is that the 
bundles for the software we release use Intel compilers for Linux, Mac and Windows 
(OneAPI Intel MPI for Linux and Windows, OpenMPI compiled with Intel for macOS). I'm 
just trying to get PETSc compiled with Intel to maintain the scheme we have and keep 
these compilers, which would be handy if we are to release an alternative Poisson 
solver using PETSc in the future.
For our research projects I'm thinking we'll use gcc/OpenMPI on Linux clusters.

Marcos

From: Satish Balay 
Sent: Monday, May 15, 2023 12:48 PM
To: Vanella, Marcos (Fed) 
Cc: petsc-users 
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

Ops - for some reason I assumed this build is on Mac M1. [likely due to the 
usage of '-m64' - that was strange]..

But yeah - our general usage on Mac is with xcode/clang and brew gfortran (on 
both Intel and ARM CPUs) - and unless you need Intel compilers for specific 
needs - clang/gfortran should work better for this development work.

Satish

On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:

> Hi Satish, well turns out this is not an M1 Mac, it is an older Intel Mac 
> (2019).
> I'm trying to get a local computer to do development and tests, but I also 
> have access to linux clusters with GPU which we plan to go to next.
> Thanks for the suggestion, I might also try compiling a gcc/gfortran version 
> of the lib on this computer.
> Marcos
> 
> From: Satish Balay 
> Sent: Monday, May 15, 2023 12:10 PM
> To: Vanella, Marcos (Fed) 
> Cc: petsc-users@mcs.anl.gov 
> Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
> OpenMPI
>
> I see Intel compilers here are building x86_64 binaries - that get run on the 
> Arm M1 CPU - perhaps there are issues here with this mode of usage..
>
> > I'm starting to work with PETSc. Our plan is to use the linear solver from 
> > PETSc for the Poisson equation on our numerical scheme and test this on a 
> > GPU cluster.
>
> What does intel compilers provide you for this use case?
>
> Why not use xcode/clang with gfortran here - i.e native ARM binaries?
>
>
> Satish
>
> On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:
>
> > Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
> > 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
> > Ventura 13.3.1.
> > I can compile PETSc in debug mode with this configure and make lines. I can 
> > run the PETSC tests, which seem fine.
> > When I compile the library in optimized mode, either using -O3 or O1, for 
> > example configuring with:
> >
> > $ ./configure --prefix=/opt/petsc-oneapi22u3 
> > --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
> > -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
> > FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
> > --with-shared-libraries=0 --download-make
> >
> > and using mpicc (icc), mpif90 (ifort) from  Open MPI, the static lib 
> > compiles. Yet, I see right off the bat this segfault error in the first 
> > PETSc example:
> >
> > $ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
> > PETSC_ARCH=arch-darwin-c-opt test
> > /Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
> > --no-print-directory -f 
> > /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
> > PETSC_ARCH=arch-darwin-c-opt 
> > PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
> > /opt/intel/oneapi/intelpython/latest/bin/python3 
> > /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
> > --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
> > --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
> > Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
> > PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
> >  CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
> > In file included from 
> > /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
> >  from 
> > /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
> > /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): 
> > warning #2621: attribute "warn_unused_result" does not apply here
> >   PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
> > ^
> >
> > CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
> >TEST 
> > arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
> > not ok sys_classes_draw_tests-ex1_1 # Error code: 139
> > # [excess:98681] *** Process received signal ***
> > # [excess:98681] Signal: Segmentation fault: 11 (11)
> > # [excess:98681] Signal code: Address not mapped (1)
> > # 

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Matthew Knepley
Send us

  $PETSC_ARCH/include/petscconf.h

  Thanks,

 Matt


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Hi Matt, I configured the lib like this:

$ ./configure --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 
--with-debugging=0 --with-shared-libraries=0 --download-make

and compiled. I still get some check segfault error. See below:

$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
PETSC_ARCH=arch-darwin-c-opt check
Running check examples to verify correct installation
Using PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 and 
PETSC_ARCH=arch-darwin-c-opt
***Error detected during compile or link!***
See https://petsc.org/release/faq/
/Users/mnv/Documents/Software/petsc-3.19.1/src/snes/tutorials ex19
*
mpicc -Wl,-bind_at_load -Wl,-multiply_defined,suppress -Wl,-multiply_defined 
-Wl,suppress -Wl,-commons,use_dylibs -Wl,-search_paths_first 
-Wl,-no_compact_unwind  -fPIC -wd1572 -Wno-unknown-pragmas -g -O3  
-I/Users/mnv/Documents/Software/petsc-3.19.1/include 
-I/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/include 
-I/opt/X11/include  -std=c99 ex19.c  
-L/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/lib 
-Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.1/lib 
-L/opt/intel/oneapi/mkl/2022.2.1/lib -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib 
-L/opt/openmpi414_oneapi22u3/lib 
-Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib 
-L/opt/intel/oneapi/tbb/2021.7.1/lib -L/opt/intel/oneapi/ippcp/2021.6.2/lib 
-L/opt/intel/oneapi/ipp/2021.6.2/lib 
-L/opt/intel/oneapi/dnnl/2022.2.1/cpu_iomp/lib 
-L/opt/intel/oneapi/dal/2021.7.1/lib 
-L/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib 
-L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib 
-Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib
 -L/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib 
-Wl,-rpath,/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin 
-L/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin -lpetsc 
-lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lpthread -lX11 -lmpi_usempif08 
-lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lopen-rte -lopen-pal -limf -lm -lz 
-lifport -lifcoremt -lsvml -lipgo -lirc -lpthread -lclang_rt.osx -lmpi 
-lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ -lipgo -ldecimal -lirc 
-lclang_rt.osx -lmpi -lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ 
-lipgo -ldecimal -lirc -lclang_rt.osx -o ex19
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and 
will be removed from product release in the second half of 2023. The Intel(R) 
oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. 
Please transition to use this compiler. Use '-diag-disable=10441' to disable 
this message.
In file included from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscvec.h(9),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscmat.h(7),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscpc.h(7),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscksp.h(7),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsnes.h(7),
 from ex19.c(68):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning 
#2621: attribute "warn_unused_result" does not apply here
  PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
^

Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process
See https://petsc.org/release/faq/
[excess:37807] *** Process received signal ***
[excess:37807] Signal: Segmentation fault: 11 (11)
[excess:37807] Signal code: Address not mapped (1)
[excess:37807] Failing at address: 0x7f
[excess:37807] *** End of error message ***
--
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--
--
mpiexec noticed that process rank 0 with PID 0 on node excess exited on signal 
11 (Segmentation fault: 11).
--
Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes
See https://petsc.org/release/faq/
[excess:37831] *** Process received signal ***
[excess:37831] Signal: Segmentation fault: 11 (11)
[excess:37831] Signal code: Address not mapped (1)
[excess:37831] Failing at address: 0x7f
[excess:37831] *** End of error message ***
[excess:37832] *** Process received signal ***
[excess:37832] Signal: Segmentation fault: 11 (11)
[excess:37832] Signal code: Address not mapped (1)
[excess:37832] Failing at 

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Satish Balay via petsc-users
Oops - for some reason I assumed this build was on a Mac M1 [likely due to the 
usage of '-m64' - that was strange].

But yeah - our general usage on Mac is with Xcode/clang and brew gfortran (on 
both Intel and ARM CPUs) - and unless you need the Intel compilers for specific 
needs, clang/gfortran should work better for this development work.

Satish
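
(A sketch of that kind of build - PETSc's configure can be pointed at clang and
the brew gfortran directly and can download an MPI for you; the option values
are only an example, assuming gfortran is on the PATH:)

$ ./configure --with-cc=clang --with-cxx=clang++ --with-fc=gfortran \
    --download-mpich --with-debugging=0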

On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:

> Hi Satish, well turns out this is not an M1 Mac, it is an older Intel Mac 
> (2019).
> I'm trying to get a local computer to do development and tests, but I also 
> have access to linux clusters with GPU which we plan to go to next.
> Thanks for the suggestion, I might also try compiling a gcc/gfortran version 
> of the lib on this computer.
> Marcos
> 
> From: Satish Balay 
> Sent: Monday, May 15, 2023 12:10 PM
> To: Vanella, Marcos (Fed) 
> Cc: petsc-users@mcs.anl.gov 
> Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
> OpenMPI
> 
> I see Intel compilers here are building x86_64 binaries - that get run on the 
> Arm M1 CPU - perhaps there are issues here with this mode of usage..
> 
> > I'm starting to work with PETSc. Our plan is to use the linear solver from 
> > PETSc for the Poisson equation on our numerical scheme and test this on a 
> > GPU cluster.
> 
> What does intel compilers provide you for this use case?
> 
> Why not use xcode/clang with gfortran here - i.e native ARM binaries?
> 
> 
> Satish
> 

Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Hi Satish, well, it turns out this is not an M1 Mac, it is an older Intel Mac 
(2019).
I'm trying to get a local computer for development and tests, but I also have 
access to Linux clusters with GPUs, which we plan to move to next.
Thanks for the suggestion, I might also try compiling a gcc/gfortran version of 
the lib on this computer.
Marcos

From: Satish Balay 
Sent: Monday, May 15, 2023 12:10 PM
To: Vanella, Marcos (Fed) 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

I see Intel compilers here are building x86_64 binaries - that get run on the 
Arm M1 CPU - perhaps there are issues here with this mode of usage..

> I'm starting to work with PETSc. Our plan is to use the linear solver from 
> PETSc for the Poisson equation on our numerical scheme and test this on a GPU 
> cluster.

What does intel compilers provide you for this use case?

Why not use xcode/clang with gfortran here - i.e native ARM binaries?


Satish


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Thank you Matt, I'll try this and let you know.
Marcos

From: Matthew Knepley 
Sent: Monday, May 15, 2023 12:08 PM
To: Vanella, Marcos (Fed) 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and 
OpenMPI

On Mon, May 15, 2023 at 11:19 AM Vanella, Marcos (Fed) via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
Ventura 13.3.1.
I can compile PETSc in debug mode with this configure and make lines. I can run 
the PETSC tests, which seem fine.
When I compile the library in optimized mode, either using -O3 or O1, for 
example configuring with:

I hate to yell "compiler bug" when this happens, but it sure seems like one. 
Can you just use

  --with-debugging=0

without the custom COPTFLAGS, CXXOPTFLAGS, FOPTFLAGS? If that works, it is 
almost
certainly a compiler bug. If not, then we can go in the debugger and see what 
is failing.

  Thanks,

Matt

$ ./configure --prefix=/opt/petsc-oneapi22u3 
--with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
-diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
--with-shared-libraries=0 --download-make

and using mpicc (icc), mpif90 (ifort) from  Open MPI, the static lib compiles. 
Yet, I see right off the bat this segfault error in the first PETSc example:

$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
PETSC_ARCH=arch-darwin-c-opt test
/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
--no-print-directory -f 
/Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
PETSC_ARCH=arch-darwin-c-opt 
PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
/opt/intel/oneapi/intelpython/latest/bin/python3 
/Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
--petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
--petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
 CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
In file included from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning 
#2621: attribute "warn_unused_result" does not apply here
  PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
^

CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
   TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
not ok sys_classes_draw_tests-ex1_1 # Error code: 139
# [excess:98681] *** Process received signal ***
# [excess:98681] Signal: Segmentation fault: 11 (11)
# [excess:98681] Signal code: Address not mapped (1)
# [excess:98681] Failing at address: 0x7f
# [excess:98681] *** End of error message ***
# --
# Primary job  terminated normally, but 1 process returned
# a non-zero exit code. Per user-direction, the job has been aborted.
# --
# --
# mpiexec noticed that process rank 0 with PID 0 on node excess exited on 
signal 11 (Segmentation fault: 11).
# --
 ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff

I see the same segfault error in all PETSc examples.
Any help is mostly appreciated, I'm starting to work with PETSc. Our plan is to 
use the linear solver from PETSc for the Poisson equation on our numerical 
scheme and test this on a GPU cluster. So also, any guideline on how to 
interface PETSc with a fortran code and personal experience is also most 
appreciated!

Marcos





--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Satish Balay via petsc-users
I see the Intel compilers here are building x86_64 binaries that then get run
on the Arm M1 CPU - perhaps there are issues with this mode of usage.

> I'm starting to work with PETSc. Our plan is to use the linear solver from 
> PETSc for the Poisson equation on our numerical scheme and test this on a GPU 
> cluster.

What do the Intel compilers provide you for this use case?

Why not use Xcode/clang with gfortran here - i.e. native ARM binaries?


Satish

On Mon, 15 May 2023, Vanella, Marcos (Fed) via petsc-users wrote:

> Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
> 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
> Ventura 13.3.1.
> I can compile PETSc in debug mode with this configure and make lines. I can 
> run the PETSC tests, which seem fine.
> When I compile the library in optimized mode, either using -O3 or O1, for 
> example configuring with:
> 
> $ ./configure --prefix=/opt/petsc-oneapi22u3 
> --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
> -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
> FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
> --with-shared-libraries=0 --download-make
> 
> and using mpicc (icc), mpif90 (ifort) from  Open MPI, the static lib 
> compiles. Yet, I see right off the bat this segfault error in the first PETSc 
> example:
> 
> $ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
> PETSC_ARCH=arch-darwin-c-opt test
> /Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
> --no-print-directory -f 
> /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
> PETSC_ARCH=arch-darwin-c-opt 
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
> /opt/intel/oneapi/intelpython/latest/bin/python3 
> /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
> --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
> --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
> Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
>  CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
> In file included from 
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
>  from 
> /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): 
> warning #2621: attribute "warn_unused_result" does not apply here
>   PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
> ^
> 
> CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
>TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
> not ok sys_classes_draw_tests-ex1_1 # Error code: 139
> # [excess:98681] *** Process received signal ***
> # [excess:98681] Signal: Segmentation fault: 11 (11)
> # [excess:98681] Signal code: Address not mapped (1)
> # [excess:98681] Failing at address: 0x7f
> # [excess:98681] *** End of error message ***
> # 
> --
> # Primary job  terminated normally, but 1 process returned
> # a non-zero exit code. Per user-direction, the job has been aborted.
> # 
> --
> # 
> --
> # mpiexec noticed that process rank 0 with PID 0 on node excess exited on 
> signal 11 (Segmentation fault: 11).
> # 
> --
>  ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff
> 
> I see the same segfault error in all PETSc examples.
> Any help is mostly appreciated, I'm starting to work with PETSc. Our plan is 
> to use the linear solver from PETSc for the Poisson equation on our numerical 
> scheme and test this on a GPU cluster. So also, any guideline on how to 
> interface PETSc with a fortran code and personal experience is also most 
> appreciated!
> 
> Marcos
> 
> 
> 
> 


Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Matthew Knepley
On Mon, May 15, 2023 at 11:19 AM Vanella, Marcos (Fed) via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI
> 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX
> Ventura 13.3.1.
> I can compile PETSc in debug mode with this configure and make lines. I
> can run the PETSC tests, which seem fine.
> When I compile the library in optimized mode, either using -O3 or O1, for
> example configuring with:
>

I hate to yell "compiler bug" when this happens, but it sure seems like
one. Can you just use

  --with-debugging=0

without the custom COPTFLAGS, CXXOPTFLAGS, FOPTFLAGS? If that works, it is
almost
certainly a compiler bug. If not, then we can go in the debugger and see
what is failing.

  Thanks,

Matt


> $ ./configure --prefix=/opt/petsc-oneapi22u3
> --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g
> -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441'
> FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0
> --with-shared-libraries=0 --download-make
>
> and using mpicc (icc), mpif90 (ifort) from  Open MPI, the static lib
> compiles. Yet, I see right off the bat this segfault error in the first
> PETSc example:
>
> $ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
> PETSC_ARCH=arch-darwin-c-opt test
> /Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make
> --no-print-directory -f
> /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test
> PETSC_ARCH=arch-darwin-c-opt
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
> /opt/intel/oneapi/intelpython/latest/bin/python3
> /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py
> --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1
> --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
> Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt
> PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
>  CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
> In file included from
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
>  from
> /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
> /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68):
> warning #2621: attribute "warn_unused_result" does not apply here
>   PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
> ^
>
> CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
>TEST
> arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
> not ok sys_classes_draw_tests-ex1_1 # Error code: 139
> # [excess:98681] *** Process received signal ***
> # [excess:98681] Signal: Segmentation fault: 11 (11)
> # [excess:98681] Signal code: Address not mapped (1)
> # [excess:98681] Failing at address: 0x7f
> # [excess:98681] *** End of error message ***
> #
> --
> # Primary job  terminated normally, but 1 process returned
> # a non-zero exit code. Per user-direction, the job has been aborted.
> #
> --
> #
> --
> # mpiexec noticed that process rank 0 with PID 0 on node excess exited on
> signal 11 (Segmentation fault: 11).
> #
> --
>  ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff
>
> I see the same segfault error in all PETSc examples.
> Any help is mostly appreciated, I'm starting to work with PETSc. Our plan
> is to use the linear solver from PETSc for the Poisson equation on our
> numerical scheme and test this on a GPU cluster. So also, any guideline on
> how to interface PETSc with a fortran code and personal experience is also
> most appreciated!
>
> Marcos
>
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-15 Thread Matthew Knepley
On Mon, May 15, 2023 at 9:55 AM Zongze Yang  wrote:

> On Mon, 15 May 2023 at 17:24, Matthew Knepley  wrote:
>
>> On Sun, May 14, 2023 at 7:23 PM Zongze Yang  wrote:
>>
>>> Could you try to project the coordinates into the continuity space by
>>> enabling the option
>>> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?
>>>
>>
>> There is a comment in the code about that:
>>
>>   /* XXX FIXME Requires DMPlexSetClosurePermutationLexicographic() */
>>
>> So what is currently done is you project into the discontinuous space
>> from the GMsh coordinates,
>> and then we get the continuous coordinates from those later. This is why
>> we get the right answer.
>>
>>
> Sorry, I'm having difficulty understanding the comment and fully
> understanding your intended meaning. Are you saying that we can only
> project the space to a discontinuous space?
>

For higher order simplices, because we do not have the mapping to the GMsh
order yet.


> Additionally, should we always set
> `dm_plex_gmsh_project_petscdualspace_lagrange_continuity` to false for
> high-order gmsh files?
>

This is done automatically if you do not override it.


> With the option set to `true`, I got the following error:
>

Yes, do not do that.

  Thanks,

 Matt


> ```
> $ $PETSC_DIR/$PETSC_ARCH/tests/dm/impls/plex/tests/runex33_gmsh_3d_q2.sh
> -e "-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true"
> not ok dm_impls_plex_tests-ex33_gmsh_3d_q2 # Error code: 77
> #   Volume: 0.46875
> #   [0]PETSC ERROR: - Error Message
> --
> #   [0]PETSC ERROR: Petsc has generated inconsistent data
> #   [0]PETSC ERROR: Calculated volume 0.46875 != 1. actual volume
> (error 0.53125 > 1e-06 tol)
> #   [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
> shooting.
> #   [0]PETSC ERROR: Petsc Development GIT revision:
> v3.19.1-294-g9cc24bc9b93  GIT Date: 2023-05-15 12:07:10 +
> #   [0]PETSC ERROR: ../ex33 on a arch-linux-c-debug named AMA-PC-RA18
> by yzz Mon May 15 21:53:43 2023
> #   [0]PETSC ERROR: Configure options
> --CFLAGS=-I/opt/intel/oneapi/mkl/latest/include
> --CXXFLAGS=-I/opt/intel/oneapi/mkl/latest/include
> --LDFLAGS="-Wl,-rpath,/opt/intel/oneapi/mkl/latest/lib/intel64
> -L/opt/intel/oneapi/mkl/latest/lib/intel64" --download-bison
> --download-chaco --download-cmake
> --download-eigen="/home/yzz/firedrake/complex-int32-mkl-X-debug/src/eigen-3.3.3.tgz
> " --download-fftw --download-hdf5 --download-hpddm --download-hwloc
> --download-libpng --download-metis --download-mmg --download-mpich
> --download-mumps --download-netcdf --download-p4est --download-parmmg
> --download-pastix --download-pnetcdf --download-ptscotch
> --download-scalapack --download-slepc --download-suitesparse
> --download-superlu_dist --download-tetgen --download-triangle
> --with-blaslapack-dir=/opt/intel/oneapi/mkl/latest --with-c2html=0
> --with-debugging=1 --with-fortran-bindings=0
> --with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/latest
> --with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/latest
> --with-scalar-type=complex --with-shared-libraries=1 --with-x=1 --with-zlib
> PETSC_ARCH=arch-linux-c-debug
> #   [0]PETSC ERROR: #1 CheckVolume() at
> /home/yzz/opt/petsc/src/dm/impls/plex/tests/ex33.c:246
> #   [0]PETSC ERROR: #2 main() at
> /home/yzz/opt/petsc/src/dm/impls/plex/tests/ex33.c:261
> #   [0]PETSC ERROR: PETSc Option Table entries:
> #   [0]PETSC ERROR: -coord_space 0 (source: command line)
> #   [0]PETSC ERROR: -dm_plex_filename
> /home/yzz/opt/petsc/share/petsc/datafiles/meshes/cube_q2.msh (source:
> command line)
> #   [0]PETSC ERROR: -dm_plex_gmsh_project (source: command line)
> #   [0]PETSC ERROR:
> -dm_plex_gmsh_project_petscdualspace_lagrange_continuity true (source:
> command line)
> #   [0]PETSC ERROR: -tol 1e-6 (source: command line)
> #   [0]PETSC ERROR: -volume 1.0 (source: command line)
> #   [0]PETSC ERROR: End of Error Message ---send
> entire error message to petsc-ma...@mcs.anl.gov--
> #   application called MPI_Abort(MPI_COMM_SELF, 77) - process 0
>  ok dm_impls_plex_tests-ex33_gmsh_3d_q2 # SKIP Command failed so no diff
> ```
>
>
> Best wishes,
> Zongze
>
>   Thanks,
>>
>>  Matt
>>
>>
>>> Best wishes,
>>> Zongze
>>>
>>>
>>> On Mon, 15 May 2023 at 04:24, Matthew Knepley  wrote:
>>>
 On Sun, May 14, 2023 at 12:27 PM Zongze Yang 
 wrote:

>
>
>
> On Sun, 14 May 2023 at 23:54, Matthew Knepley 
> wrote:
>
>> On Sun, May 14, 2023 at 9:21 AM Zongze Yang 
>> wrote:
>>
>>> Hi, Matt,
>>>
>>> The issue has been resolved while testing on the latest version of
>>> PETSc. It seems that the problem has been fixed in the following merge
>>> request:  https://gitlab.com/petsc/petsc/-/merge_requests/5970
>>>
>>
>> No problem. Glad it is working.
>>
>>
>>> I 

[petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI

2023-05-15 Thread Vanella, Marcos (Fed) via petsc-users
Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 
4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX 
Ventura 13.3.1.
I can compile PETSc in debug mode with this configure and make lines. I can run 
the PETSC tests, which seem fine.
When I compile the library in optimized mode, either using -O3 or -O1, for 
example configuring with:

$ ./configure --prefix=/opt/petsc-oneapi22u3 
--with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g 
-diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' 
FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 
--with-shared-libraries=0 --download-make

and using mpicc (icc) and mpif90 (ifort) from Open MPI, the static lib compiles. 
Yet, I see right off the bat this segfault error in the first PETSc example:

$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 
PETSC_ARCH=arch-darwin-c-opt test
/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make 
--no-print-directory -f 
/Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test 
PETSC_ARCH=arch-darwin-c-opt 
PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
/opt/intel/oneapi/intelpython/latest/bin/python3 
/Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py 
--petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 
--petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt 
PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
 CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
In file included from 
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
 from 
/Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning 
#2621: attribute "warn_unused_result" does not apply here
  PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
^

CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
   TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
not ok sys_classes_draw_tests-ex1_1 # Error code: 139
# [excess:98681] *** Process received signal ***
# [excess:98681] Signal: Segmentation fault: 11 (11)
# [excess:98681] Signal code: Address not mapped (1)
# [excess:98681] Failing at address: 0x7f
# [excess:98681] *** End of error message ***
# --
# Primary job  terminated normally, but 1 process returned
# a non-zero exit code. Per user-direction, the job has been aborted.
# --
# --
# mpiexec noticed that process rank 0 with PID 0 on node excess exited on 
signal 11 (Segmentation fault: 11).
# --
 ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff

I see the same segfault error in all PETSc examples.
Any help is most appreciated; I'm just starting to work with PETSc. Our plan 
is to use the linear solver from PETSc for the Poisson equation in our 
numerical scheme and test this on a GPU cluster. Also, any guidance on how to 
interface PETSc with a Fortran code, and any personal experience with that, 
would be most appreciated!

Marcos





Re: [petsc-users] problem for using PCBDDC

2023-05-15 Thread Stefano Zampini
BDDC is a domain decomposition solver of the non-overlapping type and
cannot be used on assembled operators.
If you want to use it, you need to restructure your code a bit.

I presume from your message that your current approach is

1) generate_assembled_csr
2) decompose_csr? or decompose_mesh?
3) get_subdomain_relevant_entries
4) set in local matrix

This is wrong, since you are summing up redundant matrix values in the final
MATIS format (you can check that MatMultEqual returns false if you compare
the assembled operator and the MATIS operator).

You should restructure your code as

1) decompose mesh
2) generate_csr_only_for_local_subdomain
3) set values in local ordering into the MATIS object

You can start with a simple two-cell problem, with each cell assigned to a
different process, to understand how to move forward.

You can play with src/ksp/ksp/tutorials/ex71.c, which uses a structured grid,
to understand how to set up a MATIS for a PDE solve.
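
To make step 3) above concrete, here is a minimal sketch (not taken from ex71)
of building a MATIS from purely local subdomain contributions. The names
nlocal, Nglobal, and l2g are illustrative assumptions: l2g[] is the
local-to-global dof map each rank would derive from its METIS partition.

```
/* Hedged sketch: assemble a MATIS from this rank's subdomain CSR only.
   nlocal  = number of dofs touched by this rank's subdomain (owned + interface)
   l2g[]   = hypothetical local-to-global dof map from the METIS partition
   Nglobal = global matrix size */
#include <petscmat.h>

static PetscErrorCode BuildMatIS(MPI_Comm comm, PetscInt nlocal, const PetscInt l2g[],
                                 PetscInt Nglobal, Mat *A)
{
  ISLocalToGlobalMapping map;
  Mat                    Aloc;

  PetscFunctionBeginUser;
  PetscCall(ISLocalToGlobalMappingCreate(comm, 1, nlocal, l2g, PETSC_COPY_VALUES, &map));
  /* The map defines each rank's subdomain; interface dofs appear on several ranks */
  PetscCall(MatCreateIS(comm, 1, PETSC_DECIDE, PETSC_DECIDE, Nglobal, Nglobal, map, map, A));
  /* Sequential subdomain matrix, built ONLY from this rank's element contributions */
  PetscCall(MatCreateSeqAIJ(PETSC_COMM_SELF, nlocal, nlocal, 0, NULL, &Aloc));
  /* ... loop over local elements and MatSetValues(Aloc, ...) in LOCAL numbering ... */
  PetscCall(MatAssemblyBegin(Aloc, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(Aloc, MAT_FINAL_ASSEMBLY));
  PetscCall(MatISSetLocalMat(*A, Aloc));
  PetscCall(MatDestroy(&Aloc));
  PetscCall(MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY));
  PetscCall(ISLocalToGlobalMappingDestroy(&map));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

One can then select the solver with -pc_type bddc and, as noted above, check
the construction by comparing MatMult of this MATIS against the fully
assembled AIJ operator with MatMultEqual.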

Hope this helps


On Mon, 15 May 2023 at 17:08, ziming xiong <xiongziming2...@gmail.com> wrote:

> Hello sir,
> I am a PhD student and am trying to use the PCBDDC method in petsc to
> solve the matrix, but the final result is wrong. So I would like to ask you
> a few questions.
> First I will describe the flow of my code, I first used the finite element
> method to build the total matrix in CSR format (total boundary conditions
> have been imposed), where I did not build the total matrix, but only the
> parameters ia, ja,value in CSR format, through which the parameters of the
> metis (xadj, adjncy) are derived. The matrix is successfully divided into 2
> subdomains using metis. After getting the global index of the points of
> each subdomain by the part parameter of metis. I apply
> ISLocalToGlobalMappingCreate to case mapping and use
> ISGlobalToLocalMappingApply to convert the global index of points within
> each process to local index and use MatSetValueLocal to populate the
> corresponding subdomain matrix for each process. Here I am missing the
> relationship of the boundary points between subdomains, and by using
> ISGlobalToLocalMappingApply (I use IS_GTOLM_MASK to get the points outside
> the subdomains converted to -1) I can get the index of the missing
> relationship in the global matrix as well as the value. After creating the
> global MATIS use MatISSetLocalMat to synchronize the subdomain matrix to
> the global MATIS. After using MatSetValues to add the relationship of the
> boundary points between subdomains into the global MATIS. The final
> calculation is performed, but the final result is not correct.
> My question is:
> 1. in PetscCall(MatAssemblyBegin(matIS, MAT_FINAL_ASSEMBLY)).
> PetscCall(MatAssemblyEnd(matIS, MAT_FINAL_ASSEMBLY)).
> After that, when viewing the matrix by
> PetscCall(MatView(matIS,PETSC_VIEWER_STDOUT_WORLD));, each process will
> output the non-zero items of the matrix separately, but this index is the
> local index is this normal?
> 2. I found that after using MatSetValues to add the relationship of
> boundary points between subdomains into the global MATIS, the calculation
> result does not change. Why is this? Can I interpolate directly into the
> global MATIS if I know the global matrix index of the missing relations?
>
>
> Best regards,
> Ziming XIONG
>
>


-- 
Stefano


[petsc-users] problem for using PCBDDC

2023-05-15 Thread ziming xiong
Hello sir,
I am a PhD student trying to use the PCBDDC method in PETSc to solve a
linear system, but the final result is wrong, so I would like to ask you a
few questions.
First I will describe the flow of my code. I use the finite element method
to build the total matrix in CSR format (all boundary conditions have been
imposed); I never assemble the total matrix itself, only the CSR arrays ia,
ja, and value, from which the METIS inputs (xadj, adjncy) are derived. The
matrix is successfully divided into 2 subdomains using METIS, and the part
array from METIS gives the global index of the points of each subdomain. I
use ISLocalToGlobalMappingCreate to create the mapping, use
ISGlobalToLocalMappingApply to convert the global indices of the points
within each process to local indices, and use MatSetValueLocal to populate
the corresponding subdomain matrix on each process. At this point I am
missing the coupling entries between boundary points of different
subdomains; by using ISGlobalToLocalMappingApply (with IS_GTOLM_MASK, so
points outside the subdomain are mapped to -1) I can get the indices of the
missing couplings in the global matrix as well as their values. After
creating the global MATIS, I use MatISSetLocalMat to attach the subdomain
matrix to the global MATIS, and then use MatSetValues to add the couplings
between boundary points of different subdomains into the global MATIS. The
final calculation is performed, but the result is not correct.
My questions are:
1. After PetscCall(MatAssemblyBegin(matIS, MAT_FINAL_ASSEMBLY)) and
PetscCall(MatAssemblyEnd(matIS, MAT_FINAL_ASSEMBLY)), when viewing the matrix
with PetscCall(MatView(matIS, PETSC_VIEWER_STDOUT_WORLD)), each process
outputs the nonzero entries of its matrix separately, and the indices shown
are local indices. Is this normal?
2. I found that after using MatSetValues to add the coupling entries between
boundary points of different subdomains into the global MATIS, the
calculation result does not change. Why is this? Can I insert values directly
into the global MATIS if I know the global matrix indices of the missing
couplings?


Best regards,
Ziming XIONG


Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-15 Thread Zongze Yang
On Mon, 15 May 2023 at 17:24, Matthew Knepley  wrote:

> On Sun, May 14, 2023 at 7:23 PM Zongze Yang  wrote:
>
>> Could you try to project the coordinates into the continuity space by
>> enabling the option
>> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?
>>
>
> There is a comment in the code about that:
>
>   /* XXX FIXME Requires DMPlexSetClosurePermutationLexicographic() */
>
> So what is currently done is you project into the discontinuous space from
> the GMsh coordinates,
> and then we get the continuous coordinates from those later. This is why
> we get the right answer.
>
>
Sorry, I'm having difficulty understanding the comment and fully
understanding your intended meaning. Are you saying that we can only
project the space to a discontinuous space?

Additionally, should we always set
`dm_plex_gmsh_project_petscdualspace_lagrange_continuity` to false for
high-order gmsh files?

With the option set to `true`, I got the following error:
```
$ $PETSC_DIR/$PETSC_ARCH/tests/dm/impls/plex/tests/runex33_gmsh_3d_q2.sh -e
"-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true"
not ok dm_impls_plex_tests-ex33_gmsh_3d_q2 # Error code: 77
#   Volume: 0.46875
#   [0]PETSC ERROR: - Error Message
--
#   [0]PETSC ERROR: Petsc has generated inconsistent data
#   [0]PETSC ERROR: Calculated volume 0.46875 != 1. actual volume
(error 0.53125 > 1e-06 tol)
#   [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
shooting.
#   [0]PETSC ERROR: Petsc Development GIT revision:
v3.19.1-294-g9cc24bc9b93  GIT Date: 2023-05-15 12:07:10 +
#   [0]PETSC ERROR: ../ex33 on a arch-linux-c-debug named AMA-PC-RA18
by yzz Mon May 15 21:53:43 2023
#   [0]PETSC ERROR: Configure options
--CFLAGS=-I/opt/intel/oneapi/mkl/latest/include
--CXXFLAGS=-I/opt/intel/oneapi/mkl/latest/include
--LDFLAGS="-Wl,-rpath,/opt/intel/oneapi/mkl/latest/lib/intel64
-L/opt/intel/oneapi/mkl/latest/lib/intel64" --download-bison
--download-chaco --download-cmake
--download-eigen="/home/yzz/firedrake/complex-int32-mkl-X-debug/src/eigen-3.3.3.tgz
" --download-fftw --download-hdf5 --download-hpddm --download-hwloc
--download-libpng --download-metis --download-mmg --download-mpich
--download-mumps --download-netcdf --download-p4est --download-parmmg
--download-pastix --download-pnetcdf --download-ptscotch
--download-scalapack --download-slepc --download-suitesparse
--download-superlu_dist --download-tetgen --download-triangle
--with-blaslapack-dir=/opt/intel/oneapi/mkl/latest --with-c2html=0
--with-debugging=1 --with-fortran-bindings=0
--with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/latest
--with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/latest
--with-scalar-type=complex --with-shared-libraries=1 --with-x=1 --with-zlib
PETSC_ARCH=arch-linux-c-debug
#   [0]PETSC ERROR: #1 CheckVolume() at
/home/yzz/opt/petsc/src/dm/impls/plex/tests/ex33.c:246
#   [0]PETSC ERROR: #2 main() at
/home/yzz/opt/petsc/src/dm/impls/plex/tests/ex33.c:261
#   [0]PETSC ERROR: PETSc Option Table entries:
#   [0]PETSC ERROR: -coord_space 0 (source: command line)
#   [0]PETSC ERROR: -dm_plex_filename
/home/yzz/opt/petsc/share/petsc/datafiles/meshes/cube_q2.msh (source:
command line)
#   [0]PETSC ERROR: -dm_plex_gmsh_project (source: command line)
#   [0]PETSC ERROR:
-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true (source:
command line)
#   [0]PETSC ERROR: -tol 1e-6 (source: command line)
#   [0]PETSC ERROR: -volume 1.0 (source: command line)
#   [0]PETSC ERROR: End of Error Message ---send
entire error message to petsc-ma...@mcs.anl.gov--
#   application called MPI_Abort(MPI_COMM_SELF, 77) - process 0
 ok dm_impls_plex_tests-ex33_gmsh_3d_q2 # SKIP Command failed so no diff
```


Best wishes,
Zongze

  Thanks,
>
>  Matt
>
>
>> Best wishes,
>> Zongze
>>
>>
>> On Mon, 15 May 2023 at 04:24, Matthew Knepley  wrote:
>>
>>> On Sun, May 14, 2023 at 12:27 PM Zongze Yang 
>>> wrote:
>>>



 On Sun, 14 May 2023 at 23:54, Matthew Knepley 
 wrote:

> On Sun, May 14, 2023 at 9:21 AM Zongze Yang 
> wrote:
>
>> Hi, Matt,
>>
>> The issue has been resolved while testing on the latest version of
>> PETSc. It seems that the problem has been fixed in the following merge
>> request:  https://gitlab.com/petsc/petsc/-/merge_requests/5970
>>
>
> No problem. Glad it is working.
>
>
>> I sincerely apologize for any inconvenience caused by my previous
>> message. However, I would like to provide you with additional information
>> regarding the test files. Attached to this email, you will find two Gmsh
>> files: "square_2rd.msh" and "square_3rd.msh." These files contain
>> high-order triangulated mesh data for the unit square.
>>
>> ```
>> $ ./ex33 -coord_space 0 -dm_plex_filename 

Re: [petsc-users] Issues creating DMPlex from higher order mesh generated by gmsh

2023-05-15 Thread Matthew Knepley
On Mon, May 15, 2023 at 9:30 AM Jed Brown  wrote:

> Matthew Knepley  writes:
>
> > On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
> > petsc-users@mcs.anl.gov> wrote:
> >
> >> Hi.
> >>
> >>
> >> I'm trying to read a mesh of higher element order, in this example a
> mesh
> >> consisting of 10-node tetrahedral elements, from gmsh, into PETSC. But
> It
> >> looks like the mesh is not properly being loaded and converted into a
> >> DMPlex. gmsh tells me it has generated a mesh with 7087 nodes, but when
> I
> >> view my dm object it tells me it has 1081 0-cells. This is the printout
> I
> >> get
> >>
> >
> > Hi Vilmer,
> >
> > Plex makes a distinction between topological entities, like vertices,
> edges
> > and cells, and the function spaces used to represent fields, like
> velocity
> > or coordinates. When formats use "nodes", they mix the two concepts
> > together.
> >
> > You see that if you add the number of vertices and edges, you get 7087,
> > since for P2 there is a "node" on every edge. Is anything else wrong?
>
> Note that quadratic (and higher order) tets are broken with the Gmsh
> reader. It's been on my todo list for a while.
>
> As an example, this works when using linear elements (the projection makes
> them quadratic and visualization is correct), but is tangled when holes.msh
> is quadratic.
>
> $ $PETSC_ARCH/tests/dm/impls/plex/tutorials/ex1 -dm_plex_filename
> ~/meshes/holes.msh -dm_view cgns:s.cgns -dm_coord_petscspace_degree 2
>

Projection to the continuous space is broken because we do not have the
lexicographic order on simplices done. Are you sure you are projecting
into the broken space?

  Thanks,

 Matt
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Issues creating DMPlex from higher order mesh generated by gmsh

2023-05-15 Thread Jed Brown
Matthew Knepley  writes:

> On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hi.
>>
>>
>> I'm trying to read a mesh of higher element order, in this example a mesh
>> consisting of 10-node tetrahedral elements, from gmsh, into PETSC. But It
>> looks like the mesh is not properly being loaded and converted into a
>> DMPlex. gmsh tells me it has generated a mesh with 7087 nodes, but when I
>> view my dm object it tells me it has 1081 0-cells. This is the printout I
>> get
>>
>
> Hi Vilmer,
>
> Plex makes a distinction between topological entities, like vertices, edges
> and cells, and the function spaces used to represent fields, like velocity
> or coordinates. When formats use "nodes", they mix the two concepts
> together.
>
> You see that if you add the number of vertices and edges, you get 7087,
> since for P2 there is a "node" on every edge. Is anything else wrong?

Note that quadratic (and higher order) tets are broken with the Gmsh reader. 
It's been on my todo list for a while.

As an example, this works when using linear elements (the projection makes them 
quadratic and visualization is correct), but is tangled when holes.msh is 
quadratic.

$ $PETSC_ARCH/tests/dm/impls/plex/tutorials/ex1 -dm_plex_filename 
~/meshes/holes.msh -dm_view cgns:s.cgns -dm_coord_petscspace_degree 2

SetFactory("OpenCASCADE");

radius = 0.3;
DefineConstant[
  nx = {1, Min 1, Max 30, Step 1, Name "Parameters/nx"}
  ny = {1, Min 1, Max 30, Step 1, Name "Parameters/ny"}
  extrude_length = {1, Min .1, Max 10, Step .1, Name "Parameters/extrusion 
length"}
  extrude_layers = {10, Min 1, Max 100, Step 1, Name "Parameters/extrusion 
layers"}
];
N = nx * ny;
Rectangle(1) = {0, 0, 0, 1, 1, 0};
Disk(10) = {0.5, 0.5, 0, radius};
BooleanDifference(100) = {Surface{1}; Delete;}{Surface{10}; Delete;};

For i In {0:nx-1}
  For j In {0:ny-1}
If (i + j > 0)
   Translate {i, j, 0} { Duplicata { Surface{100}; } }
EndIf
  EndFor
EndFor

Coherence;

// All the straight edges should have 8 elements
Transfinite Curve {:} = 8+1;

// Select the circles
circles = {};
For i In {0:nx-1}
  For j In {0:ny-1}
lo = .5 - radius - 1e-4;
hi = .5 + radius + 1e-4;
circles() += Curve In BoundingBox {
  i + lo, j + lo, -1e-4,
  i + hi, j + hi, +1e-4
};
  EndFor
EndFor

// Circles need 16 elements
Transfinite Curve {circles()} = 16+1;

Mesh.Algorithm = 8;

Extrude {0, 0, extrude_length} { Surface{100:100+N-1}; Layers{extrude_layers}; }

e = 1e-4;
extrude_start() = Surface In BoundingBox {-e, -e, -e, nx+e, ny+e, e};
extrude_end() = Surface In BoundingBox {
  -e, -e, extrude_length-e,
  nx+e, ny+e, extrude_length+e};

Physical Surface("start") = {extrude_start()};
Physical Surface("end") = {extrude_end()};
Physical Volume("solid") = {1:N};


Re: [petsc-users] Issues creating DMPlex from higher order mesh generated by gmsh

2023-05-15 Thread Matthew Knepley
On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Hi.
>
>
> I'm trying to read a mesh of higher element order, in this example a mesh
> consisting of 10-node tetrahedral elements, from gmsh, into PETSC. But It
> looks like the mesh is not properly being loaded and converted into a
> DMPlex. gmsh tells me it has generated a mesh with 7087 nodes, but when I
> view my dm object it tells me it has 1081 0-cells. This is the printout I
> get
>

Hi Vilmer,

Plex makes a distinction between topological entities, like vertices, edges
and cells, and the function spaces used to represent fields, like velocity
or coordinates. When formats use "nodes", they mix the two concepts
together.

You see that if you add the number of vertices and edges, you get 7087,
since for P2 there is a "node" on every edge. Is anything else wrong?
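
To make that bookkeeping concrete, here is a minimal sketch (the mesh file
name is a placeholder) that loads a Gmsh file and prints the counts Plex
sees; for a quadratic (P2) mesh, vertices + edges should reproduce the Gmsh
"node" count (1081 + 6006 = 7087 in your output).

```
/* Hedged sketch: reconcile Gmsh "nodes" with Plex topological entities. */
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM       dm;
  PetscInt vStart, vEnd, eStart, eEnd;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* "mesh_order2.msh" is a placeholder for your second-order Gmsh file */
  PetscCall(DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, "mesh_order2.msh", PETSC_TRUE, &dm));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* vertices (0-cells) */
  PetscCall(DMPlexGetDepthStratum(dm, 1, &eStart, &eEnd)); /* edges (1-cells)    */
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "vertices %" PetscInt_FMT "  edges %" PetscInt_FMT "  sum %" PetscInt_FMT "\n",
                        vEnd - vStart, eEnd - eStart, (vEnd - vStart) + (eEnd - eStart)));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```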

  Thanks,

 Matt


> ...
>
>
> Info: Done meshing order 2 (Wall 0.0169823s, CPU 0.016662s)
>
> Info: 7087 nodes 5838 elements
>
> ...
>
> DM Object: DM_0x8400_0 1 MPI process
>   type: plex
> DM_0x8400_0 in 3 dimensions:
>   Number of 0-cells per rank: 1081
>   Number of 1-cells per rank: 6006
>   Number of 2-cells per rank: 9104
>   Number of 3-cells per rank: 4178
> Labels:
>   celltype: 4 strata with value/size (0 (1081), 6 (4178), 3 (9104), 1
> (6006))
>   depth: 4 strata with value/size (0 (1081), 1 (6006), 2 (9104), 3 (4178))
>   Cell Sets: 1 strata with value/size (2 (4178))
>   Face Sets: 6 strata with value/size (12 (190), 21 (242), 20 (242), 11
> (192), 22 (242), 10 (188))
> Field P2:
>   adjacency FEM
> ...
>
>
> To replicate the error try generating a mesh according to
>
>
> https://gmsh.info/doc/texinfo/gmsh.html#t5
>
>
> 
>
> setting the element order to 2, and then loading the mesh using
>
>
> DMPlexCreateGmshFromFile
>
>
> I don't have any issues when i set the element order to 1.
>
>
> Thanks in advance,
>
> Vilmer
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-15 Thread Matthew Knepley
On Sun, May 14, 2023 at 7:23 PM Zongze Yang  wrote:

> Could you try to project the coordinates into the continuity space by
> enabling the option
> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?
>

There is a comment in the code about that:

  /* XXX FIXME Requires DMPlexSetClosurePermutationLexicographic() */

So what is currently done is you project into the discontinuous space from
the GMsh coordinates,
and then we get the continuous coordinates from those later. This is why we
get the right answer.
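
For what it is worth, once a continuous coordinate field exists, one way to
inspect the high-order coordinates cell by cell is through the coordinate DM.
This is only a minimal sketch, assuming the projection options discussed in
this thread have produced a continuous coordinate field; the file name refers
to the square_2rd.msh test mesh mentioned later, but any high-order .msh
would do.

```
/* Hedged sketch: read the per-cell closure of the high-order coordinates. */
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM           dm, cdm;
  Vec          coords;
  PetscSection cs;
  PetscInt     cStart, cEnd, c;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, "square_2rd.msh", PETSC_TRUE, &dm));
  PetscCall(DMGetCoordinateDM(dm, &cdm));
  PetscCall(DMGetLocalSection(cdm, &cs));        /* layout of the coordinate dofs */
  PetscCall(DMGetCoordinatesLocal(dm, &coords)); /* vertex + edge dofs for P2 coordinates */
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  for (c = cStart; c < cEnd; ++c) {
    PetscScalar *xc = NULL;
    PetscInt     n;

    PetscCall(DMPlexVecGetClosure(cdm, cs, coords, c, &n, &xc));
    /* xc holds the coordinates of every dof in the closure of cell c, in closure order */
    PetscCall(DMPlexVecRestoreClosure(cdm, cs, coords, c, &n, &xc));
  }
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```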

  Thanks,

 Matt


> Best wishes,
> Zongze
>
>
> On Mon, 15 May 2023 at 04:24, Matthew Knepley  wrote:
>
>> On Sun, May 14, 2023 at 12:27 PM Zongze Yang 
>> wrote:
>>
>>>
>>>
>>>
>>> On Sun, 14 May 2023 at 23:54, Matthew Knepley  wrote:
>>>
 On Sun, May 14, 2023 at 9:21 AM Zongze Yang 
 wrote:

> Hi, Matt,
>
> The issue has been resolved while testing on the latest version of
> PETSc. It seems that the problem has been fixed in the following merge
> request:  https://gitlab.com/petsc/petsc/-/merge_requests/5970
>

 No problem. Glad it is working.


> I sincerely apologize for any inconvenience caused by my previous
> message. However, I would like to provide you with additional information
> regarding the test files. Attached to this email, you will find two Gmsh
> files: "square_2rd.msh" and "square_3rd.msh." These files contain
> high-order triangulated mesh data for the unit square.
>
> ```
> $ ./ex33 -coord_space 0 -dm_plex_filename square_2rd.msh
> -dm_plex_gmsh_project
> -dm_plex_gmsh_project_petscdualspace_lagrange_continuity true
> -dm_plex_gmsh_project_fe_view -volume 1
> PetscFE Object: P2 1 MPI process
>   type: basic
>   Basic Finite Element in 2 dimensions with 2 components
>   PetscSpace Object: P2 1 MPI process
> type: sum
> Space in 2 variables with 2 components, size 12
> Sum space of 2 concatenated subspaces (all identical)
>   PetscSpace Object: sum component (sumcomp_) 1 MPI process
> type: poly
> Space in 2 variables with 1 components, size 6
> Polynomial space of degree 2
>   PetscDualSpace Object: P2 1 MPI process
> type: lagrange
> Dual space with 2 components, size 12
> Continuous Lagrange dual space
> Quadrature on a triangle of order 5 on 9 points (dim 2)
> Volume: 1.
> $ ./ex33 -coord_space 0 -dm_plex_filename square_3rd.msh
> -dm_plex_gmsh_project
> -dm_plex_gmsh_project_petscdualspace_lagrange_continuity true
> -dm_plex_gmsh_project_fe_view -volume 1
> PetscFE Object: P3 1 MPI process
>   type: basic
>   Basic Finite Element in 2 dimensions with 2 components
>   PetscSpace Object: P3 1 MPI process
> type: sum
> Space in 2 variables with 2 components, size 20
> Sum space of 2 concatenated subspaces (all identical)
>   PetscSpace Object: sum component (sumcomp_) 1 MPI process
> type: poly
> Space in 2 variables with 1 components, size 10
> Polynomial space of degree 3
>   PetscDualSpace Object: P3 1 MPI process
> type: lagrange
> Dual space with 2 components, size 20
> Continuous Lagrange dual space
> Quadrature on a triangle of order 7 on 16 points (dim 2)
> Volume: 1.
> ```
>
> Thank you for your attention and understanding. I apologize once again
> for my previous oversight.
>

 Great! If you make an MR for this, you will be included on the next
 list of PETSc contributors. Otherwise, I can do it.


>>> I appreciate your offer to handle the MR. Please go ahead and take care
>>> of it. Thank you!
>>>
>>
>> I have created the MR with your tests. They are working for me:
>>
>>   https://gitlab.com/petsc/petsc/-/merge_requests/6463
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> Best Wishes,
>>> Zongze
>>>
>>>
   Thanks,

  Matt


> Best wishes,
> Zongze
>
>
> On Sun, 14 May 2023 at 16:44, Matthew Knepley 
> wrote:
>
>> On Sat, May 13, 2023 at 6:08 AM Zongze Yang 
>> wrote:
>>
>>> Hi, Matt,
>>>
>>> There seem to be ongoing issues with projecting high-order
>>> coordinates from a gmsh file to other spaces. I would like to inquire
>>> whether there are any plans to resolve this problem.
>>>
>>> Thank you for your attention to this matter.
>>>
>>
>> Yes, I will look at it. The important thing is to have a good test.
>> Here are the higher order geometry tests
>>
>>
>> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tests/ex33.c
>>
>> I take shapes with known volume, mesh them with higher order
>> geometry, and look at the convergence to the true volume. Could you add a
>> GMsh test, meaning the .msh file and known volume, and I