Toby and Matt:

Thank you for your helpful replies.
In principle I have what I need; however, I ran into a bug with PetscSFReduce.
When I run the following on the point SF from a distributed plex (2 MPI ranks on
a small mesh)...

//==============================================================================================
PetscSFGetGraph(point_sf,&nroots,&nleaves,&ilocal,&iremote);
PetscCalloc2(nleaves,&leafdata,nroots,&rootdata);
/* Code that populates leafdata */
PetscSFReduceBegin(point_sf,MPIU_INT,leafdata,rootdata,MPI_SUM);
PetscSFReduceEnd(point_sf,MPIU_INT,leafdata,rootdata,MPI_SUM);

PetscSFView(point_sf,0);
PetscViewerASCIIPrintf(PETSC_VIEWER_STDOUT_WORLD,"## Reduce Leafdata\n"); /* I copied this from a PetscSF example */
PetscIntView(nleaves,leafdata,PETSC_VIEWER_STDOUT_WORLD);
PetscViewerASCIIPrintf(PETSC_VIEWER_STDOUT_WORLD,"## Reduce Rootdata\n");
PetscIntView(nroots,rootdata,PETSC_VIEWER_STDOUT_WORLD);
PetscFree2(leafdata,rootdata);
//==============================================================================================

... I get the following printout:

//======================================
PetscSF Object: 2 MPI processes
  type: basic
  [0] Number of roots=29, leaves=5, remote ranks=1
  [0] 9 <- (1,9)
  [0] 11 <- (1,10)
  [0] 12 <- (1,13)
  [0] 20 <- (1,20)
  [0] 27 <- (1,27)
  [1] Number of roots=29, leaves=2, remote ranks=1
  [1] 14 <- (0,13)
  [1] 19 <- (0,18)
  MultiSF sort=rank-order
## Reduce Leafdata
[0] 0: 2 2 2 0 0
[1] 0: 3 0
## Reduce Rootdata
[0] 0: 0 0 0 0 0 0 0 0 0 0 0 0 0 -686563120 0 0 0 0 0 0
[0] 20: 0 0 0 0 0 0 0 0 0
[1] 0: 0 0 0 0 0 0 0 0 0 0 0 0 0 128 0 0 0 0 0 0
[1] 20: -527386800 0 0 0 0 0 0 32610 0
//======================================

The good news is that, after the reduction, the rootdata on both ranks has the
correct number of nonzeros.
The bad news is that the nonzero values themselves are garbage (the kind of
numbers one gets from uninitialized memory).
Any ideas as to what could cause this? Could something like a previous call to 
a PetscSF or DMPlex function do this?

I am still using PETSc version 3.16, but I looked at the release notes for 3.17
and did not see any changes to PetscSFReduce().
________________________________
From: Matthew Knepley <[email protected]>
Sent: Wednesday, May 18, 2022 2:09 AM
To: Toby Isaac <[email protected]>
Cc: Ferrand, Jesus A. <[email protected]>; [email protected] 
<[email protected]>
Subject: [EXTERNAL] Re: [petsc-users] DMPlex/PetscSF How to determine if local 
topology is other rank's ghost?

On Tue, May 17, 2022 at 6:47 PM Toby Isaac <[email protected]> wrote:
A leaf point is attached to a root point (in a star forest there are only
leaves and roots), so a root point is the point that owns a degree of freedom
and a leaf point holds a ghost value.

For a "point SF" of a DMPlex:

- Each process has a local numbering of mesh points (cells + edges + faces + 
vertices): they are all potential roots, so the number of these is what is 
returned by `nroots`.

- The number of ghost mesh points is `nleaves`.

- `ilocal` would be a list of the mesh points that are leaves (using the local 
numbering).

- For each leaf in `ilocal`, `iremote` describes the root it is attached to:
which process it belongs to, and its id in *that* process's local numbering.
(A short sketch of walking this graph follows below.)
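
To make that concrete, here is a minimal, untested sketch of walking the point
SF's graph to print every ghost point on each rank (error checking omitted;
`dm` is assumed to be the distributed plex):

//==============================================================================================
PetscSF            point_sf;
PetscInt           nroots, nleaves, i;
const PetscInt    *ilocal;
const PetscSFNode *iremote;
PetscMPIInt        rank;

MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
DMGetPointSF(dm, &point_sf);
PetscSFGetGraph(point_sf, &nroots, &nleaves, &ilocal, &iremote);
for (i = 0; i < nleaves; ++i) {
  /* ilocal may be NULL, in which case the leaves are simply the points 0..nleaves-1 */
  PetscInt p = ilocal ? ilocal[i] : i;
  /* local point p is a ghost: it is owned by rank iremote[i].rank,
     where its local number is iremote[i].index */
  PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] ghost point %D <- (%D, %D)\n",
                          rank, p, iremote[i].rank, iremote[i].index);
}
PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);
//==============================================================================================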

If you're trying to create dof numberings on your own, please consider 
PetscSectionCreateGlobalSection: 
<https://petsc.org/main/docs/manualpages/PetscSection/PetscSectionCreateGlobalSection/>.
  You supply the PetscSF and a PetscSection which says how many dofs there are 
for each point and whether any have essential boundary conditions, and it 
computes a global PetscSection that tells you what the global id is for each 
dof on this process.
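
One untested sketch of that workflow, going through the DM (which builds the
global section from the local section and the point SF for you); here `nc` is
just a placeholder for the number of dofs per point, and, if I remember the
convention correctly, points owned by another process get negative sizes and
offsets encoded as -(value+1):

//==============================================================================================
PetscSection s, gs;
PetscInt     pStart, pEnd, p, nc = 1;      /* nc = dofs per point, placeholder */

DMPlexGetChart(dm, &pStart, &pEnd);
PetscSectionCreate(PETSC_COMM_WORLD, &s);
PetscSectionSetChart(s, pStart, pEnd);
for (p = pStart; p < pEnd; ++p) PetscSectionSetDof(s, p, nc);
PetscSectionSetUp(s);
DMSetLocalSection(dm, s);                  /* the DM keeps its own reference to s */
DMGetGlobalSection(dm, &gs);               /* built from the local section + point SF */
for (p = pStart; p < pEnd; ++p) {
  PetscInt off;
  PetscSectionGetOffset(gs, p, &off);
  /* off >= 0: global index of the first dof of point p, owned by this rank;
     off <  0: point p is owned by another rank (encoded, I believe, as -(off+1)) */
}
PetscSectionDestroy(&s);                   /* drop our reference; gs is owned by the DM */
//==============================================================================================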

Toby is exactly right. Also, if you want a global numbering of points, you can use

  https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexCreatePointNumbering/

and there is a similar thing for just cells or vertices.
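
For example, a rough (untested) sketch; if I remember the convention correctly,
points owned by another rank come back as negative entries encoded as -(global+1):

//==============================================================================================
IS              globalNum;
const PetscInt *gidx;
PetscInt        pStart, pEnd, p;

DMPlexGetChart(dm, &pStart, &pEnd);
DMPlexCreatePointNumbering(dm, &globalNum);
ISGetIndices(globalNum, &gidx);
for (p = pStart; p < pEnd; ++p) {
  PetscInt g   = gidx[p - pStart];
  PetscInt gid = (g >= 0) ? g : -(g + 1);  /* global number of local point p */
  /* g >= 0 means this rank owns point p; g < 0 means another rank owns it */
}
ISRestoreIndices(globalNum, &gidx);
ISDestroy(&globalNum);
//==============================================================================================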

  Thanks,

    Matt

On Tue, May 17, 2022 at 7:26 PM Ferrand, Jesus A. <[email protected]> wrote:
Dear PETSc team:

I am working with a non-overlapping distributed plex (i.e., when I call 
DMPlexDistribute(), I input overlap = 0), so only vertices and edges appear as 
ghosts to the local ranks.
For preallocation of a parallel global stiffness matrix for FEA, I want to 
determine which locally owned vertices are ghosts to another rank.

From reading the paper on PetscSF
(https://ieeexplore.ieee.org/document/9442258), I think I can answer my question
by inspecting the PetscSF returned by DMPlexDistribute() with
PetscSFGetGraph(). I am just confused by the root/leaf and ilocal/iremote
terminology.

I read the manual page on PetscSFGetGraph() 
(https://petsc.org/release/docs/manualpages/PetscSF/PetscSFGetGraph.html) and 
that gave me the impression that I need to PetscSFBcast() the point IDs from 
foreign ranks to the local ones.
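
Concretely, this is roughly what I had in mind (just an untested sketch,
assuming dm is the distributed plex, and quite possibly the wrong direction):
broadcast each owner's local point number from the roots to the ghost copies.

//==============================================================================================
PetscSF         point_sf;
PetscInt        nroots, nleaves, p;
const PetscInt *ilocal;
PetscInt       *rootdata, *leafdata;

DMGetPointSF(dm, &point_sf);
PetscSFGetGraph(point_sf, &nroots, &nleaves, &ilocal, NULL);
/* leafdata is indexed by ilocal[], i.e., by local point number, so give it nroots entries */
PetscCalloc2(nroots, &rootdata, nroots, &leafdata);
for (p = 0; p < nroots; ++p) rootdata[p] = p;   /* each owner's local point number */
PetscSFBcastBegin(point_sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE);
PetscSFBcastEnd(point_sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE);
/* now leafdata[ilocal[i]] (or leafdata[i] if ilocal is NULL) holds the owning
   rank's local number for that ghost point */
PetscFree2(rootdata, leafdata);
//==============================================================================================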

Is this correct?



Sincerely:

J.A. Ferrand

Embry-Riddle Aeronautical University - Daytona Beach FL

M.Sc. Aerospace Engineering | May 2022

B.Sc. Aerospace Engineering

B.Sc. Computational Mathematics



Sigma Gamma Tau

Tau Beta Pi



Phone: (386)-843-1829

Email(s): [email protected]<mailto:[email protected]>

    [email protected]<mailto:[email protected]>


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
