Dear PETSc Team:

Hi! I'm working on a parallel version of a PETSc script that I originally wrote in serial 
using DMPlex. After calling DMPlexDistribute(), each rank is assigned its own 
DAG whose points are numbered locally. For example, if I split a 100-cell 
mesh over 4 processors, each rank numbers its cells 0-24, as opposed to 
something like 0-24, 25-49, 50-74, and 75-99 on ranks 0, 1, 2, and 3, 
respectively. The same happens for face, edge, and vertex points: the 
local DAGs renumber the IDs starting from 0 instead of using a global numbering.
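
For concreteness, here is a minimal sketch of how I'm observing this (I'm 
assuming PETSc's PetscCall() error-checking macros and trimming the usual 
setup boilerplate; dm is my serial DMPlex):

  DM          dmDist = NULL;
  PetscInt    cStart, cEnd;
  PetscMPIInt rank;

  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
  if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }

  /* Height 0 = cells; every rank reports a range that starts at 0. */
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD,
            "[%d] cells %" PetscInt_FMT " to %" PetscInt_FMT "\n",
            rank, cStart, cEnd - 1));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));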

How can I distribute a mesh such that the global numbering is reflected in the 
local DAGs? If that isn't possible, what would be the right way to retrieve the 
global numbering? I've seen the term "star forest" (PetscSF) in some 
[petsc-users] threads discussing a similar issue, but I have little idea how to 
use one.

I've looked at the following functions:

  *   DMPlexCreatePointNumbering() - Sounds like what I need, but I'm not sure 
it will work because I rely on DMPlexGetDepthStratum(), which returns bounds in 
the local numbering (my guess at combining the two is sketched after this list).
  *   DMPlexGetCellNumbering() - Only converts Cells
  *   DMPlexGetVertexNumbering() - Only converts Vertices
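
Here is my (possibly wrong) guess at how DMPlexCreatePointNumbering() might 
pair with DMPlexGetDepthStratum(). I'm assuming the returned IS holds one 
global number per local chart point, with negative entries marking points 
owned by another rank:

  IS              globalNum;
  const PetscInt *gidx;
  PetscInt        pStart, pEnd, vStart, vEnd, p;

  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  PetscCall(DMPlexCreatePointNumbering(dm, &globalNum));
  PetscCall(ISGetIndices(globalNum, &gidx));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* depth 0 = vertices */
  for (p = vStart; p < vEnd; ++p) {
    PetscInt g = gidx[p - pStart];
    if (g < 0) g = -(g + 1); /* my understanding: negative encodes an unowned point */
    /* ... use g as the global point number ... */
  }
  PetscCall(ISRestoreIndices(globalNum, &gidx));
  PetscCall(ISDestroy(&globalNum));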

Basically, what I want is a global matrix that my MPI ranks all call 
MatSetValues() on (with ADD_VALUES as the mode). In my serial code I relied on 
the global point numbering to build the matrix; without it, I can't do it my 
way : (. Concretely, I'm manually assembling a global stiffness matrix out 
of element stiffness matrices to run FEA. Roughly, my serial loop looks like 
the sketch below.
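
In this sketch, computeElementStiffness() and getGlobalDofs() are placeholders 
for my own routines, K is the global matrix, and getGlobalDofs() is exactly 
where I need the global numbering in parallel:

  #define NODES_PER_ELEM 4 /* placeholder: e.g., linear tets */

  PetscInt    cStart, cEnd, c;
  PetscInt    dofs[NODES_PER_ELEM];
  PetscScalar Ke[NODES_PER_ELEM * NODES_PER_ELEM];

  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  for (c = cStart; c < cEnd; ++c) {
    computeElementStiffness(dm, c, Ke); /* placeholder: my element routine */
    getGlobalDofs(dm, c, dofs);         /* placeholder: needs global numbers */
    PetscCall(MatSetValues(K, NODES_PER_ELEM, dofs, NODES_PER_ELEM, dofs,
                           Ke, ADD_VALUES));
  }
  PetscCall(MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY));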

Any help is much appreciated.

Sincerely,

J.A. Ferrand

Embry-Riddle Aeronautical University - Daytona Beach FL

M.Sc. Aerospace Engineering | May 2022

B.Sc. Aerospace Engineering

B.Sc. Computational Mathematics



Sigma Gamma Tau

Tau Beta Pi

Honors Program



Phone: (386)-843-1829

Email(s): ferra...@my.erau.edu

    jesus.ferr...@gmail.com
