Hi Matthew,

OK, I started again from your ex44.c example and added the global array of coordinates.  I now just have to code the creation of the local coordinates.

Eric

On 2021-10-20 6:55 p.m., Matthew Knepley wrote:
On Wed, Oct 20, 2021 at 3:06 PM Eric Chamberland <[email protected]> wrote:

    Hi Matthew,

    we tried to reproduce the error in a simple example.

    The context is the following: we hard-coded the mesh and initial
    partition into the code (see sConnectivity and sInitialPartition)
    for 2 ranks, and we try to create a section in order to use the
    DMPlexNaturalToGlobalBegin function to retrieve our initial
    element numbers.

    Now the call to DMPlexDistribute gives different errors depending
    on what type of component we ask the field to be created on.  For our
    objective, we would like a global field to be created on the elements
    only (like a P0 interpolation).
    We now have the following error generated:

    [0]PETSC ERROR: --------------------- Error Message
    --------------------------------------------------------------
    [0]PETSC ERROR: Petsc has generated inconsistent data
    [0]PETSC ERROR: Inconsistency in indices, 18 should be 17
    [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
    [0]PETSC ERROR: Petsc Release Version 3.15.0, Mar 30, 2021
    [0]PETSC ERROR: ./bug on a  named rohan by ericc Wed Oct 20
    14:52:36 2021
    [0]PETSC ERROR: Configure options
    --prefix=/opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7
    --with-mpi-compilers=1 --with-mpi-dir=/opt/openmpi-4.1.0_gcc7
    --with-cxx-dialect=C++14 --with-make-np=12
    --with-shared-libraries=1 --with-debugging=yes --with-memalign=64
    --with-visibility=0 --with-64-bit-indices=0 --download-ml=yes
    --download-mumps=yes --download-superlu=yes --download-hpddm=yes
    --download-slepc=yes --download-superlu_dist=yes
    --download-parmetis=yes --download-ptscotch=yes
    --download-metis=yes --download-strumpack=yes
    --download-suitesparse=yes --download-hypre=yes
    --with-blaslapack-dir=/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64
    --with-mkl_pardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/..
    --with-mkl_cpardiso-dir=/opt/intel/oneapi/mkl/2021.1.1/env/..
    --with-scalapack=1
    --with-scalapack-include=/opt/intel/oneapi/mkl/2021.1.1/env/../include
    --with-scalapack-lib="-L/opt/intel/oneapi/mkl/2021.1.1/env/../lib/intel64
    -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64"
    [0]PETSC ERROR: #1 PetscSFCreateSectionSF() at
    /tmp/ompi-opt/petsc-3.15.0-debug/src/vec/is/sf/utils/sfutils.c:409
    [0]PETSC ERROR: #2 DMPlexCreateGlobalToNaturalSF() at
    /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexnatural.c:184
    [0]PETSC ERROR: #3 DMPlexDistribute() at
    /tmp/ompi-opt/petsc-3.15.0-debug/src/dm/impls/plex/plexdistribute.c:1733
    [0]PETSC ERROR: #4 main() at bug_section.cc:159
    [0]PETSC ERROR: No PETSc Option Table entries
    [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to [email protected]----------

    Hope the attached code is self-explanatory; note that to keep it
    short, we have not included the final part of it, just the buggy
    part we are encountering right now...

    Thanks for your insights,

Thanks for making the example. I tweaked it slightly. I put in a test case that just makes a parallel 7 x 10 quad mesh. This works fine. Thus I think it must be something connected with the original mesh. It is hard to get a handle on it without the coordinates. Do you think you could put the coordinate array in? I have added the code to load them (see attached file).

  Thanks,

     Matt

    Eric

    On 2021-10-06 9:23 p.m., Matthew Knepley wrote:
    On Wed, Oct 6, 2021 at 5:43 PM Eric Chamberland <[email protected]> wrote:

        Hi Matthew,

        we tried to use that.  Now, we discovered that:

        1- even if we "ask" for sfNatural creation with
        DMSetUseNatural, it is not created because
        DMPlexCreateGlobalToNaturalSF looks for a "section": this is
        not documented in DMSetUseNatural, so we are asking ourselves:
        "is this a permanent feature or a temporary situation?"

    I think explaining this will help clear up a lot.

    What the Natural2Global map does is permute a solution vector
    into the ordering that it would have had prior to mesh distribution.
    Now, in order to do this permutation, I need to know the original
    (global) data layout. If it is not specified _before_ distribution,
    we cannot build the permutation.  The section describes the data
    layout, so I need it before distribution.

    I cannot think of another way that you would implement this, but
    if you want something else, let me know.
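
    In code, the intended order of calls is roughly the following (a minimal
    sketch only; "dm", the already-built local section "s", and the output
    variables are placeholders, and error checking is omitted):

      DMSetUseNatural(dm, PETSC_TRUE);        /* request sfNatural                              */
      DMSetLocalSection(dm, s);               /* data layout must be known BEFORE distributing  */
      DMPlexDistribute(dm, 0, &sfMigration, &dmDist);
      DMPlexGetGlobalToNaturalSF(dmDist, &sfNatural);   /* now non-NULL                         */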

        2- We then tried to create a "section" in different ways: we
        took the code from the example
        petsc/src/dm/impls/plex/tests/ex15.c. However, we ended up
        with a segfault:

        corrupted size vs. prev_size
        [rohan:07297] *** Process received signal ***
        [rohan:07297] Signal: Aborted (6)
        [rohan:07297] Signal code:  (-6)
        [rohan:07297] [ 0] /lib64/libpthread.so.0(+0x13f80)[0x7f6f13be3f80]
        [rohan:07297] [ 1] /lib64/libc.so.6(gsignal+0x10b)[0x7f6f109b718b]
        [rohan:07297] [ 2] /lib64/libc.so.6(abort+0x175)[0x7f6f109b8585]
        [rohan:07297] [ 3] /lib64/libc.so.6(+0x7e2f7)[0x7f6f109fb2f7]
        [rohan:07297] [ 4] /lib64/libc.so.6(+0x857ea)[0x7f6f10a027ea]
        [rohan:07297] [ 5] /lib64/libc.so.6(+0x86036)[0x7f6f10a03036]
        [rohan:07297] [ 6] /lib64/libc.so.6(+0x861a3)[0x7f6f10a031a3]
        [rohan:07297] [ 7] /lib64/libc.so.6(+0x88740)[0x7f6f10a05740]
        [rohan:07297] [ 8] /lib64/libc.so.6(__libc_malloc+0x1b8)[0x7f6f10a070c8]
        [rohan:07297] [ 9] /lib64/libc.so.6(__backtrace_symbols+0x134)[0x7f6f10a8b064]
        [rohan:07297] [10] /home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z12reqBacktraceRNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE+0x4e)[0x4538ce]
        [rohan:07297] [11] /home/mefpp_ericc/GIREF/bin/MEF++.dev(_Z15attacheDebuggerv+0x120)[0x4523c0]
        [rohan:07297] [12] /home/mefpp_ericc/GIREF/lib/libgiref_dev_Util.so(traitementSignal+0x612)[0x7f6f28f503a2]
        [rohan:07297] [13] /lib64/libc.so.6(+0x3a210)[0x7f6f109b7210]
        [rohan:07297] [14] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscTrMallocDefault+0x6fd)[0x7f6f22f1b8ed]
        [rohan:07297] [15] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscMallocA+0x5cd)[0x7f6f22f19c2d]
        [rohan:07297] [16] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(PetscSFCreateSectionSF+0xb48)[0x7f6f23268e18]
        [rohan:07297] [17] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexCreateGlobalToNaturalSF+0x13b2)[0x7f6f241a5602]
        [rohan:07297] [18] /opt/petsc-3.15.0_debug_openmpi-4.1.0_gcc7/lib/libpetsc.so.3.15(DMPlexDistribute+0x39b1)[0x7f6f23fdca21]

    I am not sure what happened here, but if you could send a sample
    code, I will figure it out.

        If we do not create a section, the call to DMPlexDistribute
        is successful, but DMPlexGetGlobalToNaturalSF returns a null
        SF pointer...

    Yes, it just ignores it in this case because it does not have a
    global layout.

        Here are the operations we are calling (this is almost the
        code we are using; I just removed the verifications and the
        creation of the connectivity, which use our parallel structures
        and code):

        ===========

          PetscInt* lCells      = 0;
          PetscInt  lNumCorners = 0;
          PetscInt  lDimMail    = 0;
          PetscInt  lnumCells   = 0;

          // At this point we create the cells in the input format expected by
          // DMPlexBuildFromCellListParallel and set lNumCorners, lDimMail and
          // lnumCells to correct values.
          ...

          DM lDMBete = 0;
          DMPlexCreate(lMPIComm, &lDMBete);

          DMSetDimension(lDMBete, lDimMail);

          DMPlexBuildFromCellListParallel(lDMBete,
                                          lnumCells,
                                          PETSC_DECIDE,
                                          pLectureElementsLocaux.reqNbTotalSommets(),
                                          lNumCorners,
                                          lCells,
                                          PETSC_NULL);

          DM lDMBeteInterp = 0;
          DMPlexInterpolate(lDMBete, &lDMBeteInterp);
          DMDestroy(&lDMBete);
          lDMBete = lDMBeteInterp;

          DMSetUseNatural(lDMBete, PETSC_TRUE);

          PetscSF lSFMigrationSansOvl = 0;
          PetscSF lSFMigrationOvl     = 0;
          DM      lDMDistribueSansOvl = 0;
          DM      lDMAvecOverlap      = 0;

          PetscPartitioner lPart;
          DMPlexGetPartitioner(lDMBete, &lPart);
          PetscPartitionerSetFromOptions(lPart);

          PetscSection   section;
          PetscInt       numFields   = 1;
          PetscInt       numBC       = 0;
          PetscInt       numComp[1]  = {1};
          PetscInt       numDof[4]   = {1, 0, 0, 0};
          PetscInt       bcFields[1] = {0};
          IS             bcPoints[1] = {NULL};

          DMSetNumFields(lDMBete, numFields);

          DMPlexCreateSection(lDMBete, NULL, numComp, numDof, numBC,
                              bcFields, bcPoints, NULL, NULL, &section);
          DMSetLocalSection(lDMBete, section);

          DMPlexDistribute(lDMBete, 0, &lSFMigrationSansOvl,
                           &lDMDistribueSansOvl); // segfault!

        ===========

        So we have other question/remarks:

        3- Maybe PETSc expects something specific that is missing/not
        verified: for example, we didn't give any coordinates since
        we just want to partition and compute the overlap for the
        mesh... and then recover our element numbers in a "simple way"

        4- It seems to us a somewhat "big price to pay" to have to
        build an unused section just to have the global-to-natural
        ordering set. Could this requirement be avoided?

    I don't think so. There would have to be _some_ way of describing
    your data layout in terms of mesh points, and I do not see how
    you could use less memory doing that.
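
    Concretely, for the P0 layout you describe, the section is just a chart over
    the cell points with one dof each; a minimal sketch of building it by hand
    (with "dm" being the not-yet-distributed DMPlex, and error checking omitted)
    would be:

      PetscSection s;
      PetscInt     cStart, cEnd, c;

      DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);   /* height-0 points = cells */
      PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s);
      PetscSectionSetChart(s, cStart, cEnd);
      for (c = cStart; c < cEnd; ++c) PetscSectionSetDof(s, c, 1);
      PetscSectionSetUp(s);
      DMSetLocalSection(dm, s);
      PetscSectionDestroy(&s);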

        5- Are there any improvements for our use case in the 3.16 release?

    Let me try and run the code above.

      Thanks,

         Matt

        Thanks,

        Eric


        On 2021-09-29 7:39 p.m., Matthew Knepley wrote:
        On Wed, Sep 29, 2021 at 5:18 PM Eric Chamberland <[email protected]> wrote:

            Hi,

            I come back with _almost_ the original question:

            I would like to add an integer piece of information (*our*
            original element number, not the PETSc one) on each element
            of the DMPlex I create with DMPlexBuildFromCellListParallel.

            I would like this integer to be distributed by, or in the
            same way as, DMPlexDistribute distributes the mesh.

            Is it possible to do this?


        I think we already have support for what you want. If you call

        https://petsc.org/main/docs/manualpages/DM/DMSetUseNatural.html

        before DMPlexDistribute(), it will compute a PetscSF encoding
        the global to natural map. You can get it with

        
        https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGetGlobalToNaturalSF.html

        and use it with

        
        https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html

        Is this sufficient?
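
        In terms of calls, the usage would look roughly like this (a minimal
        sketch; "dm", "dmDist", "sfMigration", a global vector "gv" on the
        distributed DM, and a vector "nv" with the pre-distribution layout are
        all placeholders, and error checking is omitted):

          DMSetUseNatural(dm, PETSC_TRUE);              /* before DMPlexDistribute()            */
          DMPlexDistribute(dm, 0, &sfMigration, &dmDist);
          DMPlexGlobalToNaturalBegin(dmDist, gv, nv);   /* permute into the original ordering   */
          DMPlexGlobalToNaturalEnd(dmDist, gv, nv);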

          Thanks,

             Matt

            Thanks,

            Eric

            On 2021-07-14 1:18 p.m., Eric Chamberland wrote:
            > Hi,
            >
            > I want to use DMPlexDistribute from PETSc for computing
            > the overlap and to play with the different partitioners
            > supported.
            >
            > However, after calling DMPlexDistribute, I noticed the
            > elements are renumbered and then the original number is lost.
            >
            > What would be the best way to keep track of the element
            > renumbering?
            >
            > a) Adding an optional parameter to let the user retrieve
            > a vector or "IS" giving the old number?
            >
            > b) Adding a DMLabel (seems like a false good idea)
            >
            > c) Other idea?
            >
            > Of course, I don't want to lose performance because of
            > the need for this "mapping"...
            >
            > Thanks,
            >
            > Eric
            >
            --
            Eric Chamberland, ing., M. Ing
            Professionnel de recherche
            GIREF/Université Laval
            (418) 656-2131 poste 41 22 42



        --
        What most experimenters take for granted before they begin their
        experiments is infinitely more interesting than any results to
        which their experiments lead.
        -- Norbert Wiener

        https://www.cse.buffalo.edu/~knepley/

        --
        Eric Chamberland, ing., M. Ing
        Professionnel de recherche
        GIREF/Université Laval
        (418) 656-2131 poste 41 22 42



    --
    What most experimenters take for granted before they begin their
    experiments is infinitely more interesting than any results to
    which their experiments lead.
    -- Norbert Wiener

    https://www.cse.buffalo.edu/~knepley/

    --
    Eric Chamberland, ing., M. Ing
    Professionnel de recherche
    GIREF/Université Laval
    (418) 656-2131 poste 41 22 42



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

--
Eric Chamberland, ing., M. Ing
Professionnel de recherche
GIREF/Université Laval
(418) 656-2131 poste 41 22 42

static char help[] = "The main goal of this code is to retrieve the original element numbers as found in the "
                     "initial partitions (sInitialPartition)... but after the call to DMPlexDistribute";

#include <petsc.h>

PetscReal sCoords[88][2] = {
{0.00000000000000000e+00,0.00000000000000000e+00},
{2.00000000000000000e+00,0.00000000000000000e+00},
{0.00000000000000000e+00,1.00000000000000000e+00},
{2.00000000000000000e+00,1.00000000000000000e+00},
{1.99999999999629308e-01,0.00000000000000000e+00},
{3.99999999999115674e-01,0.00000000000000000e+00},
{5.99999999998532818e-01,0.00000000000000000e+00},
{7.99999999997949796e-01,0.00000000000000000e+00},
{9.99999999997388533e-01,0.00000000000000000e+00},
{1.19999999999789408e+00,0.00000000000000000e+00},
{1.39999999999842006e+00,0.00000000000000000e+00},
{1.59999999999894693e+00,0.00000000000000000e+00},
{1.79999999999947402e+00,0.00000000000000000e+00},
{1.99999999999629308e-01,1.00000000000000000e+00},
{3.99999999999115674e-01,1.00000000000000000e+00},
{5.99999999998532818e-01,1.00000000000000000e+00},
{7.99999999997949796e-01,1.00000000000000000e+00},
{9.99999999997388533e-01,1.00000000000000000e+00},
{1.19999999999789408e+00,1.00000000000000000e+00},
{1.39999999999842006e+00,1.00000000000000000e+00},
{1.59999999999894693e+00,1.00000000000000000e+00},
{1.79999999999947402e+00,1.00000000000000000e+00},
{0.00000000000000000e+00,1.42857142857142794e-01},
{0.00000000000000000e+00,2.85714285714285698e-01},
{0.00000000000000000e+00,4.28571428571428492e-01},
{0.00000000000000000e+00,5.71428571428571397e-01},
{0.00000000000000000e+00,7.14285714285714302e-01},
{0.00000000000000000e+00,8.57142857142857095e-01},
{2.00000000000000000e+00,1.42857142857142794e-01},
{2.00000000000000000e+00,2.85714285714285698e-01},
{2.00000000000000000e+00,4.28571428571428492e-01},
{2.00000000000000000e+00,5.71428571428571397e-01},
{2.00000000000000000e+00,7.14285714285714302e-01},
{2.00000000000000000e+00,8.57142857142857095e-01},
{1.99999999999629391e-01,1.42857142857142794e-01},
{1.99999999999629391e-01,2.85714285714285698e-01},
{1.99999999999629391e-01,4.28571428571428714e-01},
{1.99999999999629391e-01,5.71428571428571397e-01},
{1.99999999999629391e-01,7.14285714285714302e-01},
{1.99999999999629391e-01,8.57142857142857095e-01},
{3.99999999999115674e-01,1.42857142857142794e-01},
{3.99999999999115674e-01,2.85714285714285698e-01},
{3.99999999999115674e-01,4.28571428571428714e-01},
{3.99999999999115674e-01,5.71428571428571397e-01},
{3.99999999999115674e-01,7.14285714285714302e-01},
{3.99999999999115674e-01,8.57142857142857095e-01},
{5.99999999998532596e-01,1.42857142857142794e-01},
{5.99999999998532818e-01,2.85714285714285698e-01},
{5.99999999998532818e-01,4.28571428571428603e-01},
{5.99999999998532818e-01,5.71428571428571397e-01},
{5.99999999998532818e-01,7.14285714285714302e-01},
{5.99999999998532818e-01,8.57142857142857206e-01},
{7.99999999997949685e-01,1.42857142857142794e-01},
{7.99999999997950018e-01,2.85714285714285698e-01},
{7.99999999997949685e-01,4.28571428571428714e-01},
{7.99999999997949796e-01,5.71428571428571397e-01},
{7.99999999997950018e-01,7.14285714285714191e-01},
{7.99999999997950018e-01,8.57142857142856984e-01},
{9.99999999997388533e-01,1.42857142857142794e-01},
{9.99999999997388533e-01,2.85714285714285698e-01},
{9.99999999997388533e-01,4.28571428571428714e-01},
{9.99999999997388533e-01,5.71428571428571397e-01},
{9.99999999997388533e-01,7.14285714285714302e-01},
{9.99999999997388533e-01,8.57142857142857095e-01},
{1.19999999999789408e+00,1.42857142857142794e-01},
{1.19999999999789408e+00,2.85714285714285698e-01},
{1.19999999999789408e+00,4.28571428571428492e-01},
{1.19999999999789408e+00,5.71428571428571397e-01},
{1.19999999999789408e+00,7.14285714285714302e-01},
{1.19999999999789408e+00,8.57142857142857095e-01},
{1.39999999999842006e+00,1.42857142857142794e-01},
{1.39999999999842006e+00,2.85714285714285698e-01},
{1.39999999999842006e+00,4.28571428571428714e-01},
{1.39999999999842006e+00,5.71428571428571397e-01},
{1.39999999999842006e+00,7.14285714285714302e-01},
{1.39999999999842006e+00,8.57142857142857095e-01},
{1.59999999999894693e+00,1.42857142857142794e-01},
{1.59999999999894693e+00,2.85714285714285698e-01},
{1.59999999999894693e+00,4.28571428571428714e-01},
{1.59999999999894693e+00,5.71428571428571397e-01},
{1.59999999999894693e+00,7.14285714285714302e-01},
{1.59999999999894693e+00,8.57142857142857095e-01},
{1.79999999999947402e+00,1.42857142857142794e-01},
{1.79999999999947291e+00,2.85714285714285698e-01},
{1.79999999999947402e+00,4.28571428571428492e-01},
{1.79999999999947402e+00,5.71428571428571397e-01},
{1.79999999999947402e+00,7.14285714285714302e-01},
{1.79999999999947402e+00,8.57142857142857095e-01}
};

//Connectivity of a 7x10 rectangular mesh of quads :
PetscInt sConnectivity[70][4] = {
{0,4,34,22},
{22,34,35,23},
{23,35,36,24},
{24,36,37,25},
{25,37,38,26},
{26,38,39,27},
{27,39,13,2},
{4,5,40,34},
{34,40,41,35},
{35,41,42,36},
{36,42,43,37},
{37,43,44,38},
{38,44,45,39},
{39,45,14,13},
{5,6,46,40},
{40,46,47,41},
{41,47,48,42},
{42,48,49,43},
{43,49,50,44},
{44,50,51,45},
{45,51,15,14},
{6,7,52,46},
{46,52,53,47},
{47,53,54,48},
{48,54,55,49},
{49,55,56,50},
{50,56,57,51},
{51,57,16,15},
{7,8,58,52},
{52,58,59,53},
{53,59,60,54},
{54,60,61,55},
{55,61,62,56},
{56,62,63,57},
{57,63,17,16},
{8,9,64,58},
{58,64,65,59},
{59,65,66,60},
{60,66,67,61},
{61,67,68,62},
{62,68,69,63},
{63,69,18,17},
{9,10,70,64},
{64,70,71,65},
{65,71,72,66},
{66,72,73,67},
{67,73,74,68},
{68,74,75,69},
{69,75,19,18},
{10,11,76,70},
{70,76,77,71},
{71,77,78,72},
{72,78,79,73},
{73,79,80,74},
{74,80,81,75},
{75,81,20,19},
{11,12,82,76},
{76,82,83,77},
{77,83,84,78},
{78,84,85,79},
{79,85,86,80},
{80,86,87,81},
{81,87,21,20},
{12,1,28,82},
{82,28,29,83},
{83,29,30,84},
{84,30,31,85},
{85,31,32,86},
{86,32,33,87},
{87,33,3,21}
};

//The initial partitions given by reading (simulating a read by blocks for large meshes):
PetscInt sInitialPartition[2][35] = {
  {0,1,2,6,7,8,12,13,14,18,19,20,24,25,26,30,31,32,36,37,38,42,43,44,48,49,50,54,55,56,60,61,62,66,67},
  {3,4,5,9,10,11,15,16,17,21,22,23,27,28,29,33,34,35,39,40,41,45,46,47,51,52,53,57,58,59,63,64,65,68,69}
};

int main(int argc, char **argv)
{
  const PetscInt   Nc   = 35; //Same on each rank for this example...
  const PetscInt   Ncor = 4;
  const PetscInt   dim  = 2;
  const PetscInt   Nv   = 88;
  DM               dm, idm, ddm;
  PetscSF          sfVert, sfMig, sfPart;
  PetscPartitioner part;
  PetscSection     s;
  PetscInt        *cells, c;
  PetscMPIInt      size, rank;
  PetscBool        box = PETSC_FALSE, field = PETSC_FALSE;
  PetscErrorCode   ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, help);if (ierr) return ierr;
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRMPI(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRMPI(ierr);
  if (size != 2) SETERRQ(PETSC_COMM_WORLD, PETSC_ERR_SUP, "This is a 2 processors example only");
  ierr = PetscOptionsGetBool(NULL, NULL, "-box", &box, NULL);CHKERRQ(ierr);
  ierr = PetscOptionsGetBool(NULL, NULL, "-field", &field, NULL);CHKERRQ(ierr);

  ierr = DMPlexCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
  if (box) {
    ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
    ierr = DMSetFromOptions(dm);CHKERRQ(ierr);
  } else {
    ierr = PetscMalloc1(Nc * Ncor, &cells);CHKERRQ(ierr);
    for (c = 0; c < Nc; ++c) {
      PetscInt cell = sInitialPartition[rank][c], cor;

      for (cor = 0; cor < Ncor; ++cor) {
        cells[c*Ncor + cor] = sConnectivity[cell][cor];
      }
    }
    ierr = DMSetDimension(dm, dim);CHKERRQ(ierr);
    ierr = DMPlexBuildFromCellListParallel(dm, Nc, PETSC_DECIDE, Nv, Ncor, cells, &sfVert);CHKERRQ(ierr);
    //ierr = DMPlexBuildCoordinatesFromCellListParallel(dm, dim, sfVert, coords);CHKERRQ(ierr);
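    /* Build the local vertex coordinates from the global sCoords table.  This is
       only a sketch: it assumes that, because numVertices was passed as
       PETSC_DECIDE above, each rank contributes the contiguous block of global
       vertices given by a standard PetscLayout over Nv, which we mirror here. */
    {
      PetscLayout  vLayout;
      PetscReal   *coords;
      PetscInt     vStart, vEnd, v, d;

      ierr = PetscLayoutCreate(PETSC_COMM_WORLD, &vLayout);CHKERRQ(ierr);
      ierr = PetscLayoutSetLocalSize(vLayout, PETSC_DECIDE);CHKERRQ(ierr);
      ierr = PetscLayoutSetSize(vLayout, Nv);CHKERRQ(ierr);
      ierr = PetscLayoutSetBlockSize(vLayout, 1);CHKERRQ(ierr);
      ierr = PetscLayoutSetUp(vLayout);CHKERRQ(ierr);
      ierr = PetscLayoutGetRange(vLayout, &vStart, &vEnd);CHKERRQ(ierr);
      ierr = PetscMalloc1((vEnd - vStart) * dim, &coords);CHKERRQ(ierr);
      for (v = vStart; v < vEnd; ++v) {
        for (d = 0; d < dim; ++d) coords[(v - vStart)*dim + d] = sCoords[v][d];
      }
      ierr = DMPlexBuildCoordinatesFromCellListParallel(dm, dim, sfVert, coords);CHKERRQ(ierr);
      ierr = PetscFree(coords);CHKERRQ(ierr);
      ierr = PetscLayoutDestroy(&vLayout);CHKERRQ(ierr);
    }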
    ierr = PetscSFDestroy(&sfVert);CHKERRQ(ierr);
    ierr = PetscFree(cells);CHKERRQ(ierr);
    ierr = DMPlexInterpolate(dm, &idm);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = idm;
  }
  ierr = DMSetUseNatural(dm, PETSC_TRUE);CHKERRQ(ierr);
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);

  if (field) {
    const PetscInt Nf         = 1;
    const PetscInt numComp[1] = {1};
    const PetscInt numDof[3]  = {1, 0, 0};
    const PetscInt numBC      = 0;

    ierr = DMSetNumFields(dm, Nf);CHKERRQ(ierr);
    ierr = DMPlexCreateSection(dm, NULL, numComp, numDof, numBC, NULL, NULL, NULL, NULL, &s);CHKERRQ(ierr);
    ierr = DMSetLocalSection(dm, s);CHKERRQ(ierr);
    ierr = PetscSectionView(s, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
    ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
  }

  ierr = DMPlexGetPartitioner(dm, &part);CHKERRQ(ierr);
  ierr = PetscPartitionerSetFromOptions(part);CHKERRQ(ierr);

  ierr = DMPlexDistribute(dm, 0, &sfMig, &ddm); CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscSFView(sfMig, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);
  ierr = PetscSFCreateInverseSF(sfMig, &sfPart); CHKERRQ(ierr);
  ierr = PetscObjectSetName((PetscObject) sfPart, "Inverse Migration SF");CHKERRQ(ierr);
  ierr = PetscSFView(sfPart, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);

  //... to be continued....
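
  /* Sketch of the intended final part (assuming the natural vector layout matches
     the P0 section created above, i.e. one value per locally read cell): fill a
     "natural" vector with our original element numbers and map it to the
     distributed global ordering with DMPlexNaturalToGlobalBegin/End, so that each
     distributed cell carries its original number. */
  if (field && !box) {
    Vec          natVec, globVec;
    PetscScalar *lArray;

    ierr = VecCreateMPI(PETSC_COMM_WORLD, Nc, PETSC_DETERMINE, &natVec);CHKERRQ(ierr);
    ierr = VecGetArray(natVec, &lArray);CHKERRQ(ierr);
    for (c = 0; c < Nc; ++c) lArray[c] = (PetscScalar) sInitialPartition[rank][c];
    ierr = VecRestoreArray(natVec, &lArray);CHKERRQ(ierr);

    ierr = DMCreateGlobalVector(ddm, &globVec);CHKERRQ(ierr);
    ierr = DMPlexNaturalToGlobalBegin(ddm, natVec, globVec);CHKERRQ(ierr);
    ierr = DMPlexNaturalToGlobalEnd(ddm, natVec, globVec);CHKERRQ(ierr);
    ierr = VecView(globVec, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

    ierr = VecDestroy(&globVec);CHKERRQ(ierr);
    ierr = VecDestroy(&natVec);CHKERRQ(ierr);
  }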

  ierr = PetscSFDestroy(&sfMig);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&sfPart);CHKERRQ(ierr);
  ierr = DMDestroy(&ddm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

/*TEST

  testset:
    args: -field

    test:
      suffix: 0
      args:

    test:
      suffix: 1
      args: -box -dm_plex_simplex 0 -dm_plex_box_faces 7,10 -dm_distribute -dm_view hdf5:mesh.h5

TEST*/
