Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Matthew Knepley
On Tue, May 2, 2023 at 11:03 AM Karthikeyan Chockalingam - STFC UKRI <
karthikeyan.chockalin...@stfc.ac.uk> wrote:

> Thank you Matt.
>
>
>
> I will look into finding those shared nodes. Sorry, I didn’t quite follow
> when you say “Roots are owned, and leaves are not owned”.
>

That is the nomenclature from PetscSF.


>
>
> My question was specifically about numbering – how do I start numbering in
> a partition from where I left off in the previous partition, without
> double-counting the shared nodes, so that the node numbers are unique?
>

1) Determine the local sizes

   Run over the local nodes. If any are not owned, do not count them.

2) Get the local offset nStart

   Add up the local sizes to get the offset for each process using MPI_Scan().

3) Number locally

   Run over the local nodes and number each owned node, starting with nStart.
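
For concreteness, a minimal C/MPI sketch of those three steps (the names
nLocal, owned[], and globalNum[] are illustrative inputs, not PETSc API;
MPI_Exscan() is just the exclusive form of the MPI_Scan() mentioned above):

#include <mpi.h>

/* Sketch of the three steps above (names are illustrative, not PETSc API):
   nLocal      - number of nodes stored on this rank (owned + shared copies)
   owned[i]    - 1 if this rank owns node i, 0 if another rank owns it
   globalNum[] - output: unique global number for owned nodes, -1 otherwise */
void NumberOwnedNodes(MPI_Comm comm, int nLocal, const int owned[], long long globalNum[])
{
  long long nOwned = 0, nStart = 0;
  int       rank;

  /* 1) Local size: count only the nodes this rank owns */
  for (int i = 0; i < nLocal; ++i) if (owned[i]) ++nOwned;

  /* 2) Offset: exclusive prefix sum of the owned counts over the ranks
        (equivalently MPI_Scan() followed by subtracting nOwned)         */
  MPI_Exscan(&nOwned, &nStart, 1, MPI_LONG_LONG, MPI_SUM, comm);
  MPI_Comm_rank(comm, &rank);
  if (rank == 0) nStart = 0; /* MPI_Exscan leaves rank 0's output undefined */

  /* 3) Number the owned nodes consecutively starting at nStart; non-owned
        (shared) nodes receive their number later from the owning rank     */
  for (int i = 0; i < nLocal; ++i) globalNum[i] = owned[i] ? nStart++ : -1;
}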

  Thanks,

 Matt


>
>
> Let's say I have a VECMPI which is distributed among the partitions. When I
> try to retrieve the data using VecGetValues, I often run into problems
> accessing non-local data (so, for now, I scatter the vector). When some
> nodes are shared, will I not always have this problem accessing those nodes
> from the wrong partition unless those nodes are ghosted? Maybe I am not
> thinking about it correctly.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
> From: Matthew Knepley
> Date: Tuesday, 2 May 2023 at 13:35
> To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalin...@stfc.ac.uk>
> Cc: petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] Node numbering in parallel partitioned mesh
>
> On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via
> petsc-users  wrote:
>
> Hello,
>
>
>
> This is not exactly a PETSc question. I have a parallel partitioned finite
> element mesh. What are the steps involved in having a contiguous but unique
> node numbering from one partition to the next? There are nodes which are
> shared between different partitions. Moreover, this partition has to
> coincide with the parallel partition of the PETSc Vec/Mat, which ensures
> data locality.
>
>
>
> If you can post the algorithm or cite a reference, it will prove helpful.
>
>
>
> Somehow, you have to know what "nodes" are shared. Once you know this, you
> can make a rule for numbering, such as "the lowest rank gets the shared
> nodes". We encapsulate this ownership relation in the PetscSF. Roots are
> owned, and leaves are not owned. The rule above is not great for load
> balance, so we have an optimization routine for the simple PetscSF:
> https://petsc.org/main/manualpages/DMPlex/DMPlexRebalanceSharedPoints/
>
>
>
>   Thanks,
>
>
>
>  Matt
>
>
>
> Many thanks.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Karthikeyan Chockalingam - STFC UKRI via petsc-users
Thank you Matt.

I will look into finding those shared nodes. Sorry, I didn’t quite follow when 
you say “Roots are owned, and leaves are not owned”.

My question was specifically about numbering – how do I start numbering in a 
partition from where I left off in the previous partition, without 
double-counting the shared nodes, so that the node numbers are unique?

Let's say I have a VECMPI which is distributed among the partitions. When I try 
to retrieve the data using VecGetValues, I often run into problems accessing 
non-local data (so, for now, I scatter the vector). When some nodes are shared, 
will I not always have this problem accessing those nodes from the wrong 
partition unless those nodes are ghosted? Maybe I am not thinking about it 
correctly.
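
For example, if those nodes were ghosted, something along these lines would let 
me read them locally (a rough sketch only, assuming a recent PETSc; nLocal, 
nGhost, and ghostGlobalIdx[] are placeholder inputs describing this rank's owned 
and ghost nodes):

#include <petscvec.h>

/* Sketch only: a ghosted vector gives each rank local (ghost) copies of the
   shared nodes it does not own, refreshed explicitly from the owning ranks.
   nLocal, nGhost, and ghostGlobalIdx[] are hypothetical inputs.            */
static PetscErrorCode ReadSharedNodes(MPI_Comm comm, PetscInt nLocal, PetscInt nGhost,
                                      const PetscInt ghostGlobalIdx[])
{
  Vec                v, vLocal;
  const PetscScalar *a;

  PetscFunctionBeginUser;
  PetscCall(VecCreateGhost(comm, nLocal, PETSC_DECIDE, nGhost, ghostGlobalIdx, &v));
  /* ... fill v with VecSetValues() and VecAssemblyBegin/End() ... */
  PetscCall(VecGhostUpdateBegin(v, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(v, INSERT_VALUES, SCATTER_FORWARD));
  /* The local form holds the owned entries followed by the ghost entries */
  PetscCall(VecGhostGetLocalForm(v, &vLocal));
  PetscCall(VecGetArrayRead(vLocal, &a));
  /* a[0..nLocal-1] are owned values; a[nLocal..nLocal+nGhost-1] are ghosts */
  PetscCall(VecRestoreArrayRead(vLocal, &a));
  PetscCall(VecGhostRestoreLocalForm(v, &vLocal));
  PetscCall(VecDestroy(&v));
  PetscFunctionReturn(PETSC_SUCCESS);
}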

Kind regards,
Karthik.


From: Matthew Knepley 
Date: Tuesday, 2 May 2023 at 13:35
To: Chockalingam, Karthikeyan (STFC,DL,HC) 
Cc: petsc-users@mcs.anl.gov 
Subject: Re: [petsc-users] Node numbering in parallel partitioned mesh
On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via 
petsc-users <petsc-users@mcs.anl.gov> wrote:
Hello,

This is not exactly a PETSc question. I have a parallel partitioned finite 
element mesh. What are the steps involved in having a contiguous but unique 
node numbering from one partition to the next? There are nodes which are 
shared between different partitions. Moreover, this partition has to coincide 
with the parallel partition of the PETSc Vec/Mat, which ensures data locality.

If you can post the algorithm or cite a reference, it will prove helpful.

Somehow, you have to know what "nodes" are shared. Once you know this, you can 
make a rule for numbering, such as "the lowest rank gets the shared nodes". We 
encapsulate this ownership relation in the PetscSF. Roots are owned, and leaves 
are not owned. The rule above is not great for load balance, so we have an 
optimization routine for the simple PetscSF:
https://petsc.org/main/manualpages/DMPlex/DMPlexRebalanceSharedPoints/

  Thanks,

 Matt

Many thanks.

Kind regards,
Karthik.





Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Barry Smith

   Assuming you have generated your renumbering, you can use 
https://petsc.org/release/manualpages/AO/AO/#ao to convert lists in the old (or 
new) numbering to the new (or old) numbering.
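
A rough sketch of that suggestion, assuming a recent PETSc and that each rank 
holds matching arrays oldNum[] and newNum[] for its owned nodes (all names here 
are placeholders, not a prescribed API usage):

#include <petscao.h>

/* Sketch only: translate index lists between an old ("application") and new
   ("PETSc") numbering.  n, oldNum[], and newNum[] are hypothetical per-rank
   arrays giving both numbers for the nodes this rank owns.                 */
static PetscErrorCode TranslateIndices(MPI_Comm comm, PetscInt n,
                                       const PetscInt oldNum[], const PetscInt newNum[],
                                       PetscInt nIdx, PetscInt idx[])
{
  AO ao;

  PetscFunctionBeginUser;
  PetscCall(AOCreateBasic(comm, n, oldNum, newNum, &ao));
  /* Convert a list of old-numbering indices in place to the new numbering */
  PetscCall(AOApplicationToPetsc(ao, nIdx, idx));
  PetscCall(AODestroy(&ao));
  PetscFunctionReturn(PETSC_SUCCESS);
}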

  Barry


> On May 2, 2023, at 8:34 AM, Matthew Knepley  wrote:
> 
> On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via 
> petsc-users <petsc-users@mcs.anl.gov> wrote:
>> Hello,
>> 
>>  
>> 
>> This is not exactly a PETSc question. I have a parallel partitioned finite 
>> element mesh. What are the steps involved in having a contiguous but unique 
>> node numbering from one partition to the next? There are nodes which are 
>> shared between different partitions. Moreover, this partition has to 
>> coincide with the parallel partition of the PETSc Vec/Mat, which ensures 
>> data locality.
>> 
>>  
>> 
>> If you can post the algorithm or cite a reference, it will prove helpful.
>> 
> 
> Somehow, you have to know what "nodes" are shared. Once you know this, you 
> can make a rule for numbering, such as "the lowest rank gets the shared 
> nodes". We encapsulate this ownership relation in the PetscSF. Roots are 
> owned, and leaves are not owned. The rule above is not great for load 
> balance, so we have an optimization routine for the simple PetscSF:
> https://petsc.org/main/manualpages/DMPlex/DMPlexRebalanceSharedPoints/
> 
>   Thanks,
> 
>  Matt
>  
>> Many thanks.
>> 
>>  
>> 
>> Kind regards,
>> 
>> Karthik.
>> 
>>  
>> 
> 
> 



Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Matthew Knepley
On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via
petsc-users  wrote:

> Hello,
>
>
>
> This is not exactly a PETSc question. I have a parallel partitioned finite
> element mesh. What are the steps involved in having a contiguous but unique
> node numbering from one partition to the next? There are nodes which are
> shared between different partitions. Moreover, this partition has to
> coincide with the parallel partition of the PETSc Vec/Mat, which ensures
> data locality.
>
>
>
> If you can post the algorithm or cite a reference, it will prove helpful.
>

Somehow, you have to know what "nodes" are shared. Once you know this, you
can make a rule for numbering, such as "the lowest rank gets the shared
nodes". We encapsulate this ownership relation in the PetscSF. Roots are
owned, and leaves are not owned. The rule above is not great for load
balance, so we have an optimization routine for the simple PetscSF:
https://petsc.org/main/manualpages/DMPlex/DMPlexRebalanceSharedPoints/
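
As a rough sketch (assuming a recent PETSc), once the sharing is known it can be
described as an SF and the owners' global numbers pushed out to the shared
copies; nOwned, nGhost, and ghostOwner[] below are placeholder inputs built from
that sharing information:

#include <petscsf.h>

/* Sketch only: roots are the owned nodes, leaves are the ghost (non-owned)
   copies.  ghostOwner[i] = (rank, index) of the root owning ghost i;
   nOwned, nGhost, ownedNum[], and ghostNum[] are hypothetical inputs.     */
static PetscErrorCode PropagateNumbering(MPI_Comm comm, PetscInt nOwned, PetscInt nGhost,
                                         PetscSFNode ghostOwner[],
                                         const PetscInt ownedNum[], PetscInt ghostNum[])
{
  PetscSF sf;

  PetscFunctionBeginUser;
  PetscCall(PetscSFCreate(comm, &sf));
  /* ilocal = NULL means the leaves are numbered 0..nGhost-1 in local storage */
  PetscCall(PetscSFSetGraph(sf, nOwned, nGhost, NULL, PETSC_COPY_VALUES, ghostOwner, PETSC_COPY_VALUES));
  /* Push the owners' global numbers out to the ghost copies */
  PetscCall(PetscSFBcastBegin(sf, MPIU_INT, ownedNum, ghostNum, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sf, MPIU_INT, ownedNum, ghostNum, MPI_REPLACE));
  PetscCall(PetscSFDestroy(&sf));
  PetscFunctionReturn(PETSC_SUCCESS);
}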

  Thanks,

 Matt


> Many thanks.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/