Hi Matt,

Thanks for the fixes to the example.

-Gautam

On Jan 15, 2020, at 7:05 PM, Matthew Knepley <[email protected]> wrote:

On Wed, Jan 15, 2020 at 4:08 PM Matthew Knepley <[email protected]> wrote:
On Wed, Jan 15, 2020 at 3:47 PM 'Bisht, Gautam' via tdycores-dev <[email protected]> wrote:
Hi Matt,

I’m running into an error while using DMPlexNaturalToGlobalBegin/End and am hoping
you have some insight into what I’m doing incorrectly. I create a 2x2x2 grid and
distribute it across processors (N=1,2). I create a natural and a global
vector, and then call DMPlexNaturalToGlobalBegin/End. Here are the two issues:

- When N = 1, PETSc complains about DMSetUseNatural() not being called before
DMPlexDistribute(), which is certainly not the case.
- For N = 1 and 2, the global vector doesn’t have valid entries.

I’m not sure how to create the natural vector; I used DMCreateGlobalVector()
to create it, which could be the issue.

Attached is the sample code to reproduce the error and below is the screen 
output.

Cool. I will run it and figure out the problem.

1) There was bad error reporting there; it did not check for being on one process.
I am putting the fix in a new branch. If you run with

       knepley/fix-dm-g2n-serial

    it will work correctly in serial.

2) The G2N needs a serial data layout to work, so you have to make a Section 
_before_ distributing. I need to put that in the docs. I have
     fixed your example to do this and attached it. I run it with

     master *:~/Downloads/tmp/Gautam$ /PETSc3/petsc/bin/mpiexec -n 1 ./ex_test 
-dm_plex_box_faces 2,2,2 -dm_view
DM Object: 1 MPI processes
  type: plex
DM_0x84000000_0 in 3 dimensions:
  0-cells: 27
  1-cells: 54
  2-cells: 36
  3-cells: 8
Labels:
  marker: 1 strata with value/size (1 (72))
  Face Sets: 6 strata with value/size (6 (4), 5 (4), 3 (4), 4 (4), 1 (4), 2 (4))
  depth: 4 strata with value/size (0 (27), 1 (54), 2 (36), 3 (8))
Field p:
  adjacency FVM++
Natural vector:

Vec Object: 1 MPI processes
  type: seq
0.
1.
2.
3.
4.
5.
6.
7.

Global vector:

Vec Object: 1 MPI processes
  type: seq
0.
1.
2.
3.
4.
5.
6.
7.

Information about the mesh:
[0] cell = 00; (0.250000, 0.250000, 0.250000); is_local = 1
[0] cell = 01; (0.750000, 0.250000, 0.250000); is_local = 1
[0] cell = 02; (0.250000, 0.750000, 0.250000); is_local = 1
[0] cell = 03; (0.750000, 0.750000, 0.250000); is_local = 1
[0] cell = 04; (0.250000, 0.250000, 0.750000); is_local = 1
[0] cell = 05; (0.750000, 0.250000, 0.750000); is_local = 1
[0] cell = 06; (0.250000, 0.750000, 0.750000); is_local = 1
[0] cell = 07; (0.750000, 0.750000, 0.750000); is_local = 1

master *:~/Downloads/tmp/Gautam$ /PETSc3/petsc/bin/mpiexec -n 2 ./ex_test 
-dm_plex_box_faces 2,2,2 -dm_view
DM Object: Parallel Mesh 2 MPI processes
  type: plex
Parallel Mesh in 3 dimensions:
  0-cells: 27 27
  1-cells: 54 54
  2-cells: 36 36
  3-cells: 8 8
Labels:
  depth: 4 strata with value/size (0 (27), 1 (54), 2 (36), 3 (8))
  marker: 1 strata with value/size (1 (72))
  Face Sets: 6 strata with value/size (1 (4), 2 (4), 3 (4), 4 (4), 5 (4), 6 (4))
Field p:
  adjacency FVM++
Natural vector:

Vec Object: 2 MPI processes
  type: mpi
Process [0]
0.
1.
2.
3.
Process [1]
4.
5.
6.
7.

Global vector:

Vec Object: 2 MPI processes
  type: mpi
Process [0]
2.
3.
6.
7.
Process [1]
0.
1.
4.
5.

Information about the mesh:
[0] cell = 00; (0.250000, 0.750000, 0.250000); is_local = 1
[0] cell = 01; (0.750000, 0.750000, 0.250000); is_local = 1
[0] cell = 02; (0.250000, 0.750000, 0.750000); is_local = 1
[0] cell = 03; (0.750000, 0.750000, 0.750000); is_local = 1
[0] cell = 04; (0.250000, 0.250000, 0.250000); is_local = 0
[0] cell = 05; (0.750000, 0.250000, 0.250000); is_local = 0
[0] cell = 06; (0.250000, 0.250000, 0.750000); is_local = 0
[0] cell = 07; (0.750000, 0.250000, 0.750000); is_local = 0
[1] cell = 00; (0.250000, 0.250000, 0.250000); is_local = 1
[1] cell = 01; (0.750000, 0.250000, 0.250000); is_local = 1
[1] cell = 02; (0.250000, 0.250000, 0.750000); is_local = 1
[1] cell = 03; (0.750000, 0.250000, 0.750000); is_local = 1
[1] cell = 04; (0.250000, 0.750000, 0.250000); is_local = 0
[1] cell = 05; (0.750000, 0.750000, 0.250000); is_local = 0
[1] cell = 06; (0.250000, 0.750000, 0.750000); is_local = 0
[1] cell = 07; (0.750000, 0.750000, 0.750000); is_local = 0
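For reference, the essential call sequence of the fix can be sketched as below. This is a minimal sketch, not the attached ex_test.c: the one-dof-per-cell Section mirrors the example's field, error checking is elided, and the function names/signatures are as in the PETSc 3.12-era API (DMSetLocalSection was DMSetSection in older releases).

```c
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM           dm, dmDist = NULL;
  PetscSection s;
  Vec          natural, global;
  PetscInt     faces[3] = {2, 2, 2}, cStart, cEnd, c;

  PetscInitialize(&argc, &argv, NULL, NULL);
  DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_FALSE, faces, NULL, NULL, NULL, PETSC_TRUE, &dm);
  DMSetUseNatural(dm, PETSC_TRUE);   /* must precede DMPlexDistribute() */

  /* Serial data layout: one dof per cell, set up BEFORE distributing */
  DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);
  PetscSectionCreate(PETSC_COMM_WORLD, &s);
  PetscSectionSetChart(s, cStart, cEnd);
  for (c = cStart; c < cEnd; ++c) PetscSectionSetDof(s, c, 1);
  PetscSectionSetUp(s);
  DMSetLocalSection(dm, s);
  PetscSectionDestroy(&s);

  DMPlexDistribute(dm, 0, NULL, &dmDist);
  if (dmDist) { DMDestroy(&dm); dm = dmDist; }

  /* The natural vector has the same parallel layout as a global vector */
  DMCreateGlobalVector(dm, &natural);
  DMCreateGlobalVector(dm, &global);
  /* ... fill 'natural' in application (pre-distribution) order ... */
  DMPlexNaturalToGlobalBegin(dm, natural, global);
  DMPlexNaturalToGlobalEnd(dm, natural, global);

  VecDestroy(&natural); VecDestroy(&global);
  DMDestroy(&dm);
  PetscFinalize();
  return 0;
}
```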

Thanks,

   Matt

  Thanks,

    Matt

>make ex_test

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
>$PETSC_DIR/$PETSC_ARCH/bin/mpiexec -np 1 ./ex_test
Natural vector:

Vec Object: 1 MPI processes
  type: seq
0.
1.
2.
3.
4.
5.
6.
7.
[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: DM global to natural SF was not created.
You must call DMSetUseNatural() before DMPlexDistribute().

[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.12.2-537-g5f77d1e0e5  GIT 
Date: 2019-12-21 14:33:27 -0600
[0]PETSC ERROR: ./ex_test on a darwin-gcc8 named WE37411 by bish218 Wed Jan 15 
12:34:03 2020
[0]PETSC ERROR: Configure options 
--with-blaslapack-lib=/System/Library/Frameworks/Accelerate.framework/Versions/Current/Accelerate
 --download-parmetis=yes --download-metis=yes --with-hdf5-dir=/opt/local 
--download-zlib --download-exodusii=yes --download-hdf5=yes 
--download-netcdf=yes --download-pnetcdf=yes --download-hypre=yes 
--download-mpich=yes --download-mumps=yes --download-scalapack=yes 
--with-cc=/opt/local/bin/gcc-mp-8 --with-cxx=/opt/local/bin/g++-mp-8 
--with-fc=/opt/local/bin/gfortran-mp-8 --download-sowing=1 
PETSC_ARCH=darwin-gcc8
[0]PETSC ERROR: #1 DMPlexNaturalToGlobalBegin() line 289 in 
/Users/bish218/projects/petsc/petsc_v3.12.2/src/dm/impls/plex/plexnatural.c

Global vector:

Vec Object: 1 MPI processes
  type: seq
0.
0.
0.
0.
0.
0.
0.
0.

Information about the mesh:

Rank = 0
local_id = 00; (0.250000, 0.250000, 0.250000); is_local = 1
local_id = 01; (0.750000, 0.250000, 0.250000); is_local = 1
local_id = 02; (0.250000, 0.750000, 0.250000); is_local = 1
local_id = 03; (0.750000, 0.750000, 0.250000); is_local = 1
local_id = 04; (0.250000, 0.250000, 0.750000); is_local = 1
local_id = 05; (0.750000, 0.250000, 0.750000); is_local = 1
local_id = 06; (0.250000, 0.750000, 0.750000); is_local = 1
local_id = 07; (0.750000, 0.750000, 0.750000); is_local = 1

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

>$PETSC_DIR/$PETSC_ARCH/bin/mpiexec -np 2 ./ex_test
Natural vector:

Vec Object: 2 MPI processes
  type: mpi
Process [0]
0.
1.
2.
3.
Process [1]
4.
5.
6.
7.

Global vector:

Vec Object: 2 MPI processes
  type: mpi
Process [0]
0.
0.
0.
0.
Process [1]
0.
0.
0.
0.

Information about the mesh:

Rank = 0
local_id = 00; (0.250000, 0.750000, 0.250000); is_local = 1
local_id = 01; (0.750000, 0.750000, 0.250000); is_local = 1
local_id = 02; (0.250000, 0.750000, 0.750000); is_local = 1
local_id = 03; (0.750000, 0.750000, 0.750000); is_local = 1
local_id = 04; (0.250000, 0.250000, 0.250000); is_local = 0
local_id = 05; (0.750000, 0.250000, 0.250000); is_local = 0
local_id = 06; (0.250000, 0.250000, 0.750000); is_local = 0
local_id = 07; (0.750000, 0.250000, 0.750000); is_local = 0

Rank = 1
local_id = 00; (0.250000, 0.250000, 0.250000); is_local = 1
local_id = 01; (0.750000, 0.250000, 0.250000); is_local = 1
local_id = 02; (0.250000, 0.250000, 0.750000); is_local = 1
local_id = 03; (0.750000, 0.250000, 0.750000); is_local = 1
local_id = 04; (0.250000, 0.750000, 0.250000); is_local = 0
local_id = 05; (0.750000, 0.750000, 0.250000); is_local = 0
local_id = 06; (0.250000, 0.750000, 0.750000); is_local = 0
local_id = 07; (0.750000, 0.750000, 0.750000); is_local = 0

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++


-Gautam




On Jan 9, 2020, at 4:57 PM, 'Bisht, Gautam' via tdycores-dev <[email protected]> wrote:



On Jan 9, 2020, at 4:25 PM, Matthew Knepley <[email protected]> wrote:

On Thu, Jan 9, 2020 at 1:35 PM 'Bisht, Gautam' via tdycores-dev <[email protected]> wrote:

> On Jan 9, 2020, at 2:58 PM, Jed Brown <[email protected]> wrote:
>
> "'Bisht, Gautam' via tdycores-dev" <[email protected]> writes:
>
>>> Do you need to rely on the element number, or would coordinates (of a
>>> centroid?) be sufficient for your purposes?
>>
>> I do need to rely on the element number.  In my case, I have a mapping file 
>> that remaps data from one grid onto another grid. Though I’m currently 
>> creating a hexahedron mesh, in the future I would be reading in an 
>> unstructured grid from a file for which I cannot rely on coordinates.
>
> How does the mapping file work and how is it generated?

In CESM/E3SM, the mapping file is used to map fluxes or states between the grids of
two components (e.g., land and atmosphere). The mapping method can be
conservative, nearest neighbor, bilinear, etc. While CESM/E3SM uses
ESMF_RegridWeightGen to generate the mapping file, I’m using my own MATLAB
script to create it.

I’m surprised that this is not an issue for other codes that use DMPlex.
E.g., in PFLOTRAN, when a user creates a custom unstructured grid, they can
specify a material property for each grid cell. So, there should be a way to
create a VecScatter that scatters material properties read in
“application” order (i.e., the order before calling DMPlexDistribute()) to
ghosted order (i.e., the order after calling DMPlexDistribute()).

We did build something specific for this because some people wanted it. I wish
I could purge this from all simulations. It’s
definitely destructive, but this is the way the world currently is.

You want this:

  
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexNaturalToGlobalBegin.html
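A hedged sketch of that use for the material-property case: this assumes a DM on which DMSetUseNatural() was called and a Section was defined before DMPlexDistribute(); the helper name MapPropertyToGlobal is illustrative, not a PETSc API.

```c
#include <petscdmplex.h>

/* Map a cell-wise material-property vector from "application" (natural)
   order to the distributed (global) order. Assumes 'dm' was created with
   DMSetUseNatural() and had a Section defined before DMPlexDistribute(). */
static PetscErrorCode MapPropertyToGlobal(DM dm, Vec natural, Vec *global)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMCreateGlobalVector(dm, global);CHKERRQ(ierr);
  ierr = DMPlexNaturalToGlobalBegin(dm, natural, *global);CHKERRQ(ierr);
  ierr = DMPlexNaturalToGlobalEnd(dm, natural, *global);CHKERRQ(ierr);
  /* The reverse mapping is DMPlexGlobalToNaturalBegin/End(). */
  PetscFunctionReturn(0);
}
```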

Perfect.

Thanks.
-Gautam



    Thanks,

     Matt

> We can locate points and create interpolation with unstructured grids.
>
> --
> You received this message because you are subscribed to the Google Groups 
> "tdycores-dev" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/tdycores-dev/8736come4e.fsf%40jedbrown.org.



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/










<ex_test.c>
