Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-03 Thread Satish Balay via petsc-users
With xcode-15.3 and branch "barry/2024-04-03/fix-chaco-modern-c/release" from
https://gitlab.com/petsc/petsc/-/merge_requests/7433 [and a patched openmpi
tarball to remove -Wl,-commons,use_dylibs], the following works for me.

Satish



petsc@mpro petsc.x % ./configure --download-bison --download-chaco 
--download-ctetgen --download-eigen --download-fftw --download-hdf5 
--download-hpddm --download-hwloc --download-hypre --download-libpng 
--download-metis --download-mmg --download-mumps --download-netcdf 
--download-openblas --download-openblas-make-options="'USE_THREAD=0 
USE_LOCKING=1 USE_OPENMP=0'" --download-p4est --download-parmmg 
--download-pnetcdf --download-pragmatic --download-ptscotch 
--download-scalapack --download-slepc --download-suitesparse 
--download-superlu_dist --download-tetgen --download-triangle --with-c2html=0 
--with-debugging=1 --with-fortran-bindings=0 --with-shared-libraries=1 
--with-x=0 --with-zlib 
--download-openmpi=https://web.cels.anl.gov/projects/petsc/download/externalpackages/openmpi-5.0.2-xcode15.tar.gz
--download-pastix && make && make check

  CC arch-darwin-c-debug/obj/src/lme/interface/lmesolve.o
 CLINKER arch-darwin-c-debug/lib/libslepc.3.21.0.dylib
DSYMUTIL arch-darwin-c-debug/lib/libslepc.3.21.0.dylib
Now to install the library do:
make 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc 
PETSC_DIR=/Users/petsc/petsc.x install
=
*** Installing SLEPc ***
*** Installing SLEPc at prefix location: 
/Users/petsc/petsc.x/arch-darwin-c-debug  ***

Install complete.
Now to check if the libraries are working do (in current directory):
make SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug 
PETSC_DIR=/Users/petsc/petsc.x PETSC_ARCH=arch-darwin-c-debug check

/usr/bin/make --no-print-directory -f makefile PETSC_ARCH=arch-darwin-c-debug 
PETSC_DIR=/Users/petsc/petsc.x 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc 
install-builtafterslepc
/usr/bin/make --no-print-directory -f makefile PETSC_ARCH=arch-darwin-c-debug 
PETSC_DIR=/Users/petsc/petsc.x 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc 
slepc4py-install
make[6]: Nothing to be done for `slepc4py-install'.
*** Building and installing HPDDM ***
=
Now to check if the libraries are working do:
make PETSC_DIR=/Users/petsc/petsc.x PETSC_ARCH=arch-darwin-c-debug check
=
Running PETSc check examples to verify correct installation
Using PETSC_DIR=/Users/petsc/petsc.x and PETSC_ARCH=arch-darwin-c-debug
C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes
C/C++ example src/snes/tutorials/ex19 run successfully with HYPRE
C/C++ example src/snes/tutorials/ex19 run successfully with MUMPS
C/C++ example src/snes/tutorials/ex19 run successfully with SuiteSparse
C/C++ example src/snes/tutorials/ex19 run successfully with SuperLU_DIST
C/C++ example src/vec/vec/tests/ex47 run successfully with HDF5
Running SLEPc check examples to verify correct installation
Using 
SLEPC_DIR=/Users/petsc/petsc.x/arch-darwin-c-debug/externalpackages/git.slepc, 
PETSC_DIR=/Users/petsc/petsc.x, and PETSC_ARCH=arch-darwin-c-debug
C/C++ example src/eps/tests/test10 run successfully with 1 MPI process
C/C++ example src/eps/tests/test10 run successfully with 2 MPI processes
Completed SLEPc check examples
Completed PETSc check examples
petsc@mpro petsc.x % clang --version
Apple clang version 15.0.0 (clang-1500.3.9.4)
Target: arm64-apple-darwin23.4.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
petsc@mpro petsc.x % 


On Tue, 2 Apr 2024, Zongze Yang wrote:

> Thank you for the suggestion.
> 
> I'd like to share some test results using the current Xcode. When I added the 
> flag `LDFLAGS=-Wl,-ld_classic` and configured PETSc with OpenMPI, the tests 
> with the latest Xcode seemed okay, except for some link warnings. The 
> configure 
> command is
> ```
> ./configure \
> PETSC_ARCH=arch-darwin-c-debug-openmpi \
> LDFLAGS=-Wl,-ld_classic \
> --download-openmpi=https://download.open-mpi.org/release/open-mpi/v5.0/openmpi-5.0.3rc1.tar.bz2 \
> --download-mumps --download-scalapack \
> --with-clean \
> && make && make check
> ```


Re: [petsc-users] Correct way to set/track global numberings in DMPlex?

2024-04-03 Thread Jed Brown




Matthew Knepley writes:

>> I'm developing routines that will read/write CGNS files to DMPlex and vice
>> versa.
>> One of the recurring challenges is the bookkeeping of global numbering for
>> vertices and cells.
>> Currently, I am restricting my support to single Zone CGNS files, in which
>> the file provides global numbers for vertices and cells.
>>
>
> I thought Jed had put in parallel CGNS loading. If so, maybe you can
> transition to that. If not, we should get your stuff integrated.

We did implement this, but it's still in a branch. It's been on my plate for a while and I thought I would get to clean up that branch for merging this week, but that will have to wait for next week. (The current hang-up is mediating globally consistent edge orientation for isoperiodic meshes, as needed for cubic elements.)

How are you handling boundary conditions so far? Do you support periodicity?



Re: [petsc-users] Correct way to set/track global numberings in DMPlex?

2024-04-03 Thread Matthew Knepley
On Wed, Apr 3, 2024 at 2:28 PM Ferrand, Jesus A. wrote:

> Dear PETSc team:
>
> (Hoping to get a hold of Jed and/or Matt for this one)
> (Also, sorry for the mouthful below)
>
> I'm developing routines that will read/write CGNS files to DMPlex and vice
> versa.
> One of the recurring challenges is the bookkeeping of global numbering for
> vertices and cells.
> Currently, I am restricting my support to single Zone CGNS files, in which
> the file provides global numbers for vertices and cells.
>

I thought Jed had put in parallel CGNS loading. If so, maybe you can
transition to that. If not, we should get your stuff integrated.


> I used PetscHSetI as exemplified in DMPlexBuildFromCellListParallel() to
> obtain local DAG numbers from the global numbers provided by the file.
> I also used PetscSFCreateByMatchingIndices() to establish a basic DAG
> point distribution over the MPI processes.
> I use this PointSF to manually assemble a global PetscSection.
> For owned DAG points (per the PointSF), I call
> PetscSectionSetOffset(section, point, file_offset);
> For ghost DAG points (per the PointSF), I call
> PetscSectionSetOffset(section, point, -(file_offset + 1));
>

This sounds alright to me, although I admit to not understanding exactly
what is being done.

> All of what I have just described happens in my CGNS version of
> DMPlexTopologyLoad().
> My intention is to retain those numbers in the DMPlex, and reuse them in
> my CGNS analogues of DMPlexCoordinatesLoad(), DMPlexLabelsLoad(), and
> DMPlexGlobalVectorLoad().
> Anyhow, is this a good way to track global numbers?
>

The way I think about it, for example in DMPlexBuildFromCellListParallel(),
you load all parallel things in blocks (cell adjacency, vertex coordinates,
etc.). Then if you have to redistribute afterwards, you make a PetscSF to do
it. I first make one mapping points to points. With a PetscSection, you can
easily convert this into dofs to dofs. For example, we load sets of
vertices, but we want vertices distributed as they are attached to cells.
So we create a PetscSF mapping uniform blocks to the division attached to
cells. Then we use the PetscSection for coordinates to make a new PetscSF
and redistribute coordinates.
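
(For concreteness, a minimal sketch of that point-SF-plus-section pattern,
assuming a recent PETSc with PetscCall()/PETSC_SUCCESS. The helper name
RedistributeField and all variable names are illustrative only; DMPlexDistributeField()
packages essentially this sequence for DMPlex fields.)
```
#include <petsc.h>

/* Illustrative helper (not from the thread): move a field laid out by
 * rootSection to the layout induced on the receiving side by pointSF.
 * leafSection is assumed to be an empty PetscSection created by the caller;
 * leafVec is assumed to be sized to match the resulting layout. */
static PetscErrorCode RedistributeField(PetscSF pointSF, PetscSection rootSection, Vec rootVec,
                                        PetscSection leafSection, Vec leafVec)
{
  PetscSF            sectionSF;
  PetscInt          *remoteOffsets;
  const PetscScalar *src;
  PetscScalar       *dst;

  PetscFunctionBeginUser;
  /* Induce the receiving-side layout (dofs per point) from the point-to-point SF */
  PetscCall(PetscSFDistributeSection(pointSF, rootSection, &remoteOffsets, leafSection));
  /* Turn the point-to-point SF into a dof-to-dof SF using the two sections */
  PetscCall(PetscSFCreateSectionSF(pointSF, rootSection, remoteOffsets, leafSection, &sectionSF));
  PetscCall(PetscFree(remoteOffsets));
  /* Move the actual values */
  PetscCall(VecGetArrayRead(rootVec, &src));
  PetscCall(VecGetArray(leafVec, &dst));
  PetscCall(PetscSFBcastBegin(sectionSF, MPIU_SCALAR, src, dst, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sectionSF, MPIU_SCALAR, src, dst, MPI_REPLACE));
  PetscCall(VecRestoreArray(leafVec, &dst));
  PetscCall(VecRestoreArrayRead(rootVec, &src));
  PetscCall(PetscSFDestroy(&sectionSF));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```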

  Thanks,

 Matt


> Also, I need (for other applications) to eventually call
> DMPlexInterpolate() and DMPlexDistribute(); will the global PetscSection
> offsets be preserved after calling those two?
>
>
> Sincerely:
>
> *J.A. Ferrand*
>
> Embry-Riddle Aeronautical University - Daytona Beach - FL
> Ph.D. Candidate, Aerospace Engineering
>
> M.Sc. Aerospace Engineering
>
> B.Sc. Aerospace Engineering
>
> B.Sc. Computational Mathematics
>
>
> *Phone:* (386)-843-1829
>
> *Email(s):* ferra...@my.erau.edu
>
> jesus.ferr...@gmail.com
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



[petsc-users] Correct way to set/track global numberings in DMPlex?

2024-04-03 Thread Ferrand, Jesus A.
Dear PETSc team:

(Hoping to get a hold of Jed and/or Matt for this one)
(Also, sorry for the mouthful below)

I'm developing routines that will read/write CGNS files to DMPlex and vice 
versa.
One of the recurring challenges is the bookkeeping of global numbering for 
vertices and cells.
Currently, I am restricting my support to single Zone CGNS files, in which the 
file provides global numbers for vertices and cells.

I used PetscHSetI as exemplified in DMPlexBuildFromCellListParallel() to obtain 
local DAG numbers from the global numbers provided by the file.
I also used PetscSFCreateByMatchingIndices() to establish a basic DAG point 
distribution over the MPI processes.
I use this PointSF to manually assemble a global PetscSection.
For owned DAG points (per the PointSF), I call PetscSectionSetOffset(section,
point, file_offset);
For ghost DAG points (per the PointSF), I call PetscSectionSetOffset(section,
point, -(file_offset + 1));
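
(A minimal sketch of the offset convention just described, under the same
recent-PETSc assumptions as the earlier sketch; the helper name SetGlobalOffsets
and the fileOffset array are illustrative only, and the section's chart and dofs
are assumed to be already set.)
```
#include <petsc.h>

/* Illustrative sketch (not the actual code): mark PointSF leaves as ghosts and
 * encode the file-provided global numbers as section offsets, storing owned
 * offsets directly and ghost offsets as -(offset + 1) so that offset 0 stays
 * valid for owned points and ghosts decode as -(stored) - 1. The fileOffset
 * array (one entry per local point) stands in for the CGNS file's numbering. */
static PetscErrorCode SetGlobalOffsets(PetscSF pointSF, PetscSection globalSec, const PetscInt fileOffset[])
{
  PetscInt        pStart, pEnd, nleaves;
  const PetscInt *leaves;
  PetscBool      *isGhost;

  PetscFunctionBeginUser;
  PetscCall(PetscSectionGetChart(globalSec, &pStart, &pEnd));
  PetscCall(PetscSFGetGraph(pointSF, NULL, &nleaves, &leaves, NULL));
  PetscCall(PetscCalloc1(pEnd - pStart, &isGhost));
  /* Leaves of the PointSF are the ghost (non-owned) points; NULL leaves means 0..nleaves-1 */
  for (PetscInt l = 0; l < nleaves; ++l) isGhost[(leaves ? leaves[l] : l) - pStart] = PETSC_TRUE;
  for (PetscInt p = pStart; p < pEnd; ++p) {
    const PetscInt off = fileOffset[p - pStart];
    PetscCall(PetscSectionSetOffset(globalSec, p, isGhost[p - pStart] ? -(off + 1) : off));
  }
  PetscCall(PetscFree(isGhost));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```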

All of what I have just described happens in my CGNS version of 
DMPlexTopologyLoad().
My intention is to retain those numbers in the DMPlex, and reuse them in my
CGNS analogues of DMPlexCoordinatesLoad(), DMPlexLabelsLoad(), and
DMPlexGlobalVectorLoad().
Anyhow, is this a good way to track global numbers?
Also, I need (for other applications) to eventually call DMPlexInterpolate()
and DMPlexDistribute(); will the global PetscSection offsets be preserved after
calling those two?



Sincerely:

J.A. Ferrand

Embry-Riddle Aeronautical University - Daytona Beach - FL
Ph.D. Candidate, Aerospace Engineering

M.Sc. Aerospace Engineering

B.Sc. Aerospace Engineering

B.Sc. Computational Mathematics


Phone: (386)-843-1829

Email(s): ferra...@my.erau.edu

jesus.ferr...@gmail.com