Hi Matthew,

We tried to use that. Here is what we discovered:

1- Even if we "ask" for the sfNatural creation with DMSetUseNatural, it is not created, because DMPlexCreateGlobalToNaturalSF looks for a "section". This is not documented in DMSetUseNatural, so we are asking ourselves: is this a permanent feature or a temporary situation?

2- We then tried to create a "section" in different ways, starting from the code in the example petsc/src/dm/impls/plex/tests/ex15.c. However, we ended up with a segfault:

corrupted size vs. prev_size
[rohan:07297] *** Process received signal ***
[rohan:07297] Signal: Aborted (6)
[rohan:07297] Signal code:  (-6)
[rohan:07297] [ 0] /lib64/libpthread.so.0(+0x13f80)[0x7f6f13be3f80]
[rohan:07297] [ 1] /lib64/libc.so.6(gsignal+0x10b)[0x7f6f109b718b]
[rohan:07297] [ 2] /lib64/libc.so.6(abort+0x175)[0x7f6f109b8585]
[rohan:07297] [ 3] /lib64/libc.so.6(+0x7e2f7)[0x7f6f109fb2f7]
[rohan:07297] [ 4] /lib64/libc.so.6(+0x857ea)[0x7f6f10a027ea]
[rohan:07297] [ 5] /lib64/libc.so.6(+0x86036)[0x7f6f10a03036]
[rohan:07297] [ 6] /lib64/libc.so.6(+0x861a3)[0x7f6f10a031a3]
[rohan:07297] [ 7] /lib64/libc.so.6(+0x88740)[0x7f6f10a05740]
[rohan:07297] [ 8] /lib64/libc.so.6(__libc_malloc+0x1b8)[0x7f6f10a070c8]
[rohan:07297] [ 9] /lib64/libc.so.6(__backtrace_symbols+0x134)[0x7f6f10a8b064]
[rohan:07297] [10] /home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z12reqBacktraceRNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE+0x4e)[0x4538ce]
[rohan:07297] [11] /home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z15attacheDebuggerv+0x120)[0x4523c0]
[rohan:07297] [12] /home/mefpp_ericc/GIREF/lib/libgiref_dev_Util.so(traitementSignal+0x612)[0x7f6f28f503a2]
[rohan:07297] [13] /lib64/libc.so.6(+0x3a210)[0x7f6f109b7210]
[rohan:07297] [14] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscTrMallocDefault+0x6fd)[0x7f6f22f1b8ed]
[rohan:07297] [15] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscMallocA+0x5cd)[0x7f6f22f19c2d]
[rohan:07297] [16] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscSFCreateSectionSF+0xb48)[0x7f6f23268e18]
[rohan:07297] [17] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexCreateGlobalToNaturalSF+0x13b2)[0x7f6f241a5602]
[rohan:07297] [18] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexDistribute+0x39b1)[0x7f6f23fdca21]

If we do not create a section, the call to DMPlexDistribute is successful, but DMPlexGetGlobalToNaturalSF returns a null SF pointer...
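
For reference, here is roughly what we check for and what we would like to do once the sfNatural exists, following the calls Matthew pointed to below and using the distributed DM from the listing that follows (lDMDistribueSansOvl). This is only a sketch (verifications removed, and we are not sure that creating the "natural" vector with VecDuplicate gives it the right layout):

===========

  PetscSF lSFNatural = NULL;
  DMPlexGetGlobalToNaturalSF(lDMDistribueSansOvl, &lSFNatural);
  if (!lSFNatural) {
    // This is what we observe when no section is attached: the SF was never built.
  }

  // Intended use: scatter cell data from the distributed (global) ordering back
  // to our original (natural) ordering, so that entry i of lNatural should
  // correspond to our original element i.
  Vec lGlobal  = NULL;
  Vec lNatural = NULL;
  DMCreateGlobalVector(lDMDistribueSansOvl, &lGlobal);
  VecDuplicate(lGlobal, &lNatural); // assumption: is this the right way to get the natural layout?
  DMPlexGlobalToNaturalBegin(lDMDistribueSansOvl, lGlobal, lNatural);
  DMPlexGlobalToNaturalEnd(lDMDistribueSansOvl, lGlobal, lNatural);
  VecDestroy(&lNatural);
  VecDestroy(&lGlobal);

===========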

Here are the operations we are calling (this is almost the code we are using; I just removed the verifications and the creation of the connectivity, which use our parallel structures and code):

===========

  PetscInt* lCells      = 0;
  PetscInt  lNumCorners = 0;
  PetscInt  lDimMail    = 0;
  PetscInt  lnumCells   = 0;

  // At this point we create the cell connectivity in the format expected by DMPlexBuildFromCellListParallel and set lNumCorners, lDimMail and lnumCells to their correct values.
  ...

  DM       lDMBete = 0;
  DMPlexCreate(lMPIComm,&lDMBete);

  DMSetDimension(lDMBete, lDimMail);

  DMPlexBuildFromCellListParallel(lDMBete,
                                  lnumCells,
                                  PETSC_DECIDE,
                                  pLectureElementsLocaux.reqNbTotalSommets(),
                                  lNumCorners,
                                  lCells,
                                  PETSC_NULL);

  DM lDMBeteInterp = 0;
  DMPlexInterpolate(lDMBete, &lDMBeteInterp);
  DMDestroy(&lDMBete);
  lDMBete = lDMBeteInterp;

  DMSetUseNatural(lDMBete,PETSC_TRUE);

  PetscSF lSFMigrationSansOvl = 0;
  PetscSF lSFMigrationOvl = 0;
  DM lDMDistribueSansOvl = 0;
  DM lDMAvecOverlap = 0;

  PetscPartitioner lPart;
  DMPlexGetPartitioner(lDMBete, &lPart);
  PetscPartitionerSetFromOptions(lPart);

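  // Trivial section adapted from ex15.c: one field with a single dof on each
  // vertex and no boundary conditions, just so that DMPlexCreateGlobalToNaturalSF
  // can find a local section.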
  PetscSection   section;
  PetscInt       numFields   = 1;
  PetscInt       numBC       = 0;
  PetscInt       numComp[1]  = {1};
  PetscInt       numDof[4]   = {1, 0, 0, 0};
  PetscInt       bcFields[1] = {0};
  IS             bcPoints[1] = {NULL};

  DMSetNumFields(lDMBete, numFields);

  DMPlexCreateSection(lDMBete, NULL, numComp, numDof, numBC, bcFields, bcPoints, NULL, NULL, &section);
  DMSetLocalSection(lDMBete, section);

  DMPlexDistribute(lDMBete, 0, &lSFMigrationSansOvl, &lDMDistribueSansOvl); // segfault!

===========
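
The variables lSFMigrationOvl and lDMAvecOverlap above are declared for the next step we want to take once the distribution works, which (if we understand correctly) would look roughly like this, again only a sketch, not tested:

===========

  // Intended follow-up: add one layer of overlap to the distributed mesh.
  DMPlexDistributeOverlap(lDMDistribueSansOvl, 1, &lSFMigrationOvl, &lDMAvecOverlap);

===========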

So we have other questions/remarks:

3- Maybe PETSc expects something specific that is missing/not verified: for example, we did not give any coordinates, since we just want to partition the mesh and compute the overlap... and then recover our element numbers in a "simple" way.

4- It seems to us a somewhat "big price to pay" to have to build an unused section just to have the global-to-natural ordering set up. Could this requirement be avoided?

5- Are there any improvements related to our use case in the 3.16 release?

Thanks,

Eric


On 2021-09-29 7:39 p.m., Matthew Knepley wrote:
On Wed, Sep 29, 2021 at 5:18 PM Eric Chamberland <[email protected]> wrote:

    Hi,

    I come back with _almost_ the original question:

    I would like to attach an integer (*our* original element
    number, not the PETSc one) to each element of the DMPlex I create with
    DMPlexBuildFromCellListParallel.

    I would like this integer to be distributed by, or in the same way as,
    DMPlexDistribute distributes the mesh.

    Is it possible to do this?


I think we already have support for what you want. If you call

https://petsc.org/main/docs/manualpages/DM/DMSetUseNatural.html

before DMPlexDistribute(), it will compute a PetscSF encoding the global to natural map. You
can get it with

https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGetGlobalToNaturalSF.html

and use it with

https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html

Is this sufficient?

  Thanks,

     Matt

    Thanks,

    Eric

    On 2021-07-14 1:18 p.m., Eric Chamberland wrote:
    > Hi,
    >
    > I want to use DMPlexDistribute from PETSc to compute the overlap
    > and play with the different partitioners supported.
    >
    > However, after calling DMPlexDistribute, I noticed the elements are
    > renumbered and then the original number is lost.
    >
    > What would be the best way to keep track of the element renumbering?
    >
    > a) Adding an optional parameter to let the user retrieve a
    vector or
    > "IS" giving the old number?
    >
    > b) Adding a DMLabel (seems like a false good idea)
    >
    > c) Other idea?
    >
    > Of course, I don't want to lose performance because of the need for this
    > "mapping"...
    >
    > Thanks,
    >
    > Eric
    >
    --
    Eric Chamberland, ing., M. Ing
    Professionnel de recherche
    GIREF/Université Laval
    (418) 656-2131 poste 41 22 42



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

--
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42
