Hi Matthew,

Ok, I did that, but it segfaults now.  Here is the order of the calls:

DMPlexCreate
DMSetDimension
DMPlexBuildFromCellListParallel(...)
DMPlexInterpolate

PetscPartitioner lPart;
DMPlexGetPartitioner(lDMSansOverlap, &lPart);
PetscPartitionerSetFromOptions(lPart);

DMSetUseNatural(lDMSansOverlap, PETSC_TRUE);
DMPlexDistribute
DMPlexGlobalToNaturalBegin
DMPlexGlobalToNaturalEnd
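
Concretely, it looks roughly like this in code (a simplified sketch: the dimension, the elided cell-list arguments, and the Vec names are placeholders from my side, and error checking is omitted):

DM               lDM, lDMSansOverlap, lDMDistribue;
PetscPartitioner lPart;
PetscSF          lMigrationSF;

DMPlexCreate(PETSC_COMM_WORLD, &lDM);
DMSetDimension(lDM, 3);                            /* placeholder dimension */
DMPlexBuildFromCellListParallel(lDM, /* ... cell-list arguments ... */);
DMPlexInterpolate(lDM, &lDMSansOverlap);

DMPlexGetPartitioner(lDMSansOverlap, &lPart);
PetscPartitionerSetFromOptions(lPart);

DMSetUseNatural(lDMSansOverlap, PETSC_TRUE);
DMPlexDistribute(lDMSansOverlap, 0, &lMigrationSF, &lDMDistribue);

/* later, with a global Vec lGlobal and a natural Vec lNatural */
DMPlexGlobalToNaturalBegin(lDMDistribue, lGlobal, lNatural);
DMPlexGlobalToNaturalEnd(lDMDistribue, lGlobal, lNatural);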


But it gives me the following error:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: DM global to natural SF not present.
If DMPlexDistribute() was called and a section was defined, report to [email protected].
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.15.0, Mar 30, 2021
[0]PETSC ERROR: MEF++.dev on a  named rohan by ericc Wed Jul 14 16:57:48 2021
[0]PETSC ERROR: Configure options --prefix=/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7 --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.1.0_gcc7 --with-cxx-dialect=C++14 --with-make-np=12 --with-shared-libraries=1 --with-debugging=yes --with-memalign=64 --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes --download-mumps=yes --download-superlu=yes --download-hpddm=yes --download-slepc=yes --download-superlu_dist=yes --download-parmetis=yes --download-ptscotch=yes --download-metis=yes --download-strumpack=yes --download-suitesparse=yes --download-hypre=yes --with-blaslapack-dir=/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64 --with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/.. --with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/.. --with-scalapack=1 --with-scalapack-include=/opt/intel/oneapi/mkl/2021.1.1/env/../include --with-scalapack-lib="-L/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
[0]PETSC ERROR: #1 DMPlexGlobalToNaturalBegin() at /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexnatural.c:245

What did I miss?

Thanks a lot!

Eric

On 2021-07-14 3:09 p.m., Matthew Knepley wrote:
On Wed, Jul 14, 2021 at 1:18 PM Eric Chamberland <[email protected]> wrote:

    Hi,

    I want to use DMPlexDistribute from PETSc to compute the overlap
    and play with the different partitioners supported.

    However, after calling DMPlexDistribute, I noticed that the elements
    are renumbered and the original numbering is lost.

    What would be the best way to keep track of the element renumbering?

    a) Adding an optional parameter to let the user retrieve a vector or
    "IS" giving the old number?

    b) Adding a DMLabel (seems like a solution that looks good but isn't)

    c) Other idea?

    Of course, I don't want to lose performance because of the need for
    this "mapping"...


You need to call

https://petsc.org/release/docs/manualpages/DM/DMSetUseNatural.html

before calling DMPlexDistribute(). Then you can call

https://petsc.org/release/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html

to map back to the original numbering if you want. This is the same thing that DMDA does.
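
Roughly, in code (an untested sketch; dm is your serial Plex, and gvec/nvec are placeholder global and natural vectors):

DM dmDist = NULL;

DMSetUseNatural(dm, PETSC_TRUE);           /* must come before distribution */
DMPlexDistribute(dm, 0, NULL, &dmDist);    /* dmDist is NULL on one process */
if (dmDist) {
  /* ... define a section and create gvec/nvec on dmDist ... */
  DMPlexGlobalToNaturalBegin(dmDist, gvec, nvec);
  DMPlexGlobalToNaturalEnd(dmDist, gvec, nvec);
}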

  Thanks,

     Matt

    Thanks,

    Eric

    --
    Eric Chamberland, ing., M. Ing
    Professionnel de recherche
    GIREF/Université Laval
    (418) 656-2131 poste 41 22 42



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

--
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42
