Hi Matt,
I use the latest Fortran style in PETSc 3.8. Enclosed are the PETSc
configuration, the compilation log, and the function that causes the
compilation error. The error appears after I include petscsf.h in the
following section. I did not find petscsf.h in the petsc/finclude/ folder,
so I used the header file from the 'include' folder (the C header), and
that seems not to be allowed: as the log below shows, it drags in the
glibc C headers, which the Fortran compiler cannot process.
#ifdef PETSC_V3_8_X
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscdmplex.h>
#include <petsc/finclude/petscdmlabel.h>
#include <petscsf.h>
use petscsys
use petscdmplex
use petscsf
#endif
Thanks,
Danyang
On 18-03-02 12:08 PM, Matthew Knepley wrote:
On Fri, Mar 2, 2018 at 3:00 PM, Danyang Su <danyang...@gmail.com> wrote:
On 18-03-02 10:58 AM, Matthew Knepley wrote:
On Fri, Mar 2, 2018 at 1:41 PM, Danyang Su <danyang...@gmail.com> wrote:
On 18-02-19 03:30 PM, Matthew Knepley wrote:
On Mon, Feb 19, 2018 at 3:11 PM, Danyang Su <danyang...@gmail.com> wrote:
Hi Matt,
Would you please let me know how to check whether a cell is
locally owned? When overlap is 0 in DMPlexDistribute, all
the cells are locally owned. How about overlap > 0? It
seems impossible to check by node, because a cell can be
locally owned even if none of the nodes in that cell are
locally owned.
If a cell is in the PetscSF, then it is not locally owned.
The local nodes in the SF are sorted, so I use
PetscFindInt
(http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscFindInt.html).
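For concreteness, a minimal Fortran sketch of that check (a sketch only:
"cell" is a placeholder for the point being tested, and it assumes the
Fortran bindings of PetscSFGetGraph and PetscFindInt can return and search
the leaf list; if your 3.8 build only accepts PETSC_NULL_INTEGER for the
array arguments, the leaf list has to be pulled out from C instead):

PetscSF :: sf
PetscInt :: nroots, nleaves, loc, cell
PetscInt, pointer :: ileaves(:)  ! sorted leaf points (assumed retrievable in Fortran)
PetscErrorCode :: ierr

! the point SF lists every point this rank has a copy of but does not own
call DMGetPointSF(dmda_flow%da,sf,ierr)
CHKERRQ(ierr)
call PetscSFGetGraph(sf,nroots,nleaves,ileaves,PETSC_NULL_INTEGER,ierr)
CHKERRQ(ierr)
! binary search in the sorted leaf list:
!   loc >= 0 -> cell is a leaf (ghost, owned by another rank)
!   loc <  0 -> cell is locally owned
call PetscFindInt(cell,nleaves,ileaves,loc,ierr)
CHKERRQ(ierr)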
Hi Matt,
Would you please tell me a little more about how to mark the
ghost cells when overlap > 0? What do you mean by a cell being
in the PetscSF? I used PetscSFView to export the graph (original
mesh file pile.vtk), and it exports all the cells, including
the ghost cells (PETScSFView.txt).
Yes, I will send you some sample code when I get time. The first
problem is that you are looking at a different PetscSF. The one you
viewed looks like the one returned by DMPlexDistribute(), which maps
the serial mesh to the parallel mesh. You want
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMGetPointSF.html
Then you can look at
https://bitbucket.org/petsc/petsc/src/1788fc36644e622df8cb1a0de85676ccc5af0239/src/dm/impls/plex/plexsubmesh.c?at=master&fileviewer=file-view-default#plexsubmesh.c-683
There, I get the pointSF, extract the list of leaves, and find points in
it using PetscFindInt().
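Roughly, the distinction looks like this (a sketch only, reusing the names
from the attached subroutine; dmDist, migrationSF, and pointSF are names I
made up): the SF in PETScSFView.txt is the migration SF handed back by
DMPlexDistribute(), while the ghost information lives in the point SF of
the distributed DM.

PetscSF :: migrationSF, pointSF
DM :: dmDist
PetscErrorCode :: ierr

! migration SF: relates the serial mesh to the distributed mesh
call DMPlexDistribute(dmda_flow%da,stencil_width,migrationSF,dmDist,ierr)
CHKERRQ(ierr)
! point SF of the distributed DM: its leaves are exactly the points
! this rank holds as copies of points owned elsewhere
call DMGetPointSF(dmDist,pointSF,ierr)
CHKERRQ(ierr)
call PetscSFView(pointSF,PETSC_VIEWER_STDOUT_WORLD,ierr)
CHKERRQ(ierr)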
Hi Matt,
By using the local dm, I can get the PetscSF I want, as shown
below. Now I need to get, for each processor, the number of ghost
cells, the number of local cells (here 4944), or the number of
leaves (here 825). I tried to use PetscSFGetGraph to get the number
of leaves in Fortran, but after including "petscsf.h" I get the
compilation error "You need a ISO C conforming compiler to use the
glibc headers". Is there an alternative way to do this? I do not need
the ghost-neighbor mapping, just the number of locally owned cells.
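For reference, a rough sketch of one way that count could be computed once
the leaf list of the point SF is available (same caveat as in the earlier
sketch about the Fortran binding of PetscSFGetGraph; nghost_cells and
nlocal_cells are made-up names, and ileaves/nleaves are the leaf list and
leaf count from that call):

PetscInt :: cStart, cEnd, nghost_cells, nlocal_cells, i

! cells are the height-0 points of the plex
call DMPlexGetHeightStratum(dmda_flow%da,0,cStart,cEnd,ierr)
CHKERRQ(ierr)
! leaves that fall inside [cStart,cEnd) are ghost cells; the rest of
! the cell range is locally owned
nghost_cells = 0
do i = 1, nleaves
  if (ileaves(i) >= cStart .and. ileaves(i) < cEnd) then
    nghost_cells = nghost_cells + 1
  end if
end do
nlocal_cells = (cEnd - cStart) - nghost_cells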
Also, make sure you are using the latest Fortran style for PETSc:
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/UsingFortran.html
[0] Number of roots=11449, leaves=825, remote ranks=1
[0] 4944 <- (1,0)
[0] 4945 <- (1,28)
[0] 4946 <- (1,56)
...
[1] Number of roots=11695, leaves=538, remote ranks=1
[1] 5056 <- (0,21)
[1] 5057 <- (0,43)
[1] 5058 <- (0,65)
[1] 5059 <- (0,87)
In file included from /usr/include/features.h:375:0,
from /usr/include/stdio.h:28,
from /home/dsu/Soft/PETSc/petsc-3.8.3/include/petscsys.h:175,
from /home/dsu/Soft/PETSc/petsc-3.8.3/include/petscsf.h:7,
from ../../solver/solver_ddmethod.F90:4837:
/usr/include/x86_64-linux-gnu/sys/cdefs.h:30:3: error: #error "You need a ISO C conforming compiler to use the glibc headers"
# error "You need a ISO C conforming compiler to use the glibc headers"
Can you send this to petsc-ma...@mcs.anl.gov? It looks like a build
problem that can be fixed.
Thanks,
Matt
Thanks,
Danyang
Thanks,
Matt
Thanks,
Danyang
Thanks,
Matt
Thanks,
Danyang
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
PETSc configuration
--with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mumps --download-scalapack --download-parmetis --download-metis --download-ptscotch --download-fblaslapack --download-mpich --download-hypre --download-superlu_dist --download-hdf5=yes --download-ctetgen --with-debugging=1
/home/dsu/Soft/PETSc/petsc-3.8.3/linux-gnu-dbg/bin/mpif90 -c -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -I/home/dsu/Soft/lis/lis-1.7.36/lis-mpi/include -I/home/dsu/Soft/PETSc/petsc-3.8.3/include -I/home/dsu/Soft/PETSc/petsc-3.8.3/linux-gnu-dbg/include -DLINUX -DDEBUG -DDEBUG_X64 -DSTANDARD_FORTRAN -DUSG -DPETSC -DMPI -DPETSC_V3_8_X -DOUTPUT_DOUBLE -DPETSC_HAVE_MUMPS -DPETSC_HAVE_SUPERLU -DLIS -o ../../solver/solver_ddmethod.o ../../solver/solver_ddmethod.F90
In file included from /usr/include/features.h:375:0,
from /usr/include/stdio.h:28,
from /home/dsu/Soft/PETSc/petsc-3.8.3/include/petscsys.h:175,
from /home/dsu/Soft/PETSc/petsc-3.8.3/include/petscsf.h:7,
from ../../solver/solver_ddmethod.F90:4837:
/usr/include/x86_64-linux-gnu/sys/cdefs.h:30:3: error: #error "You need a ISO C conforming compiler to use the glibc headers"
# error "You need a ISO C conforming compiler to use the glibc headers"
^
In file included from /usr/include/features.h:399:0,
from /usr/include/stdio.h:28,
from /home/dsu/Soft/PETSc/petsc-3.8.3/include/petscsys.h:175,
from /home/dsu/Soft/PETSc/petsc-3.8.3/include/petscsf.h:7,
from ../../solver/solver_ddmethod.F90:4837:
/usr/include/x86_64-linux-gnu/gnu/stubs.h:7:0: fatal error: gnu/stubs-32.h: No such file or directory
# include <gnu/stubs-32.h>
^
compilation terminated.
subroutine solver_dd_create_dmplex
#ifdef USG
#ifdef PETSC_V3_8_X
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscdmplex.h>
#include <petsc/finclude/petscdmlabel.h>
#include <petscsf.h>
use petscsys
use petscdmplex
use petscsf
#else
#ifndef PETSC_V3_7_X
#ifndef PETSC_V3_6_X
#ifndef PETSC_V3_5_X
#ifndef PETSC_V3_4_X
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscdmplex.h>
#include <petsc/finclude/petscdmlabel.h>
#include <petscsf.h>
use petscsys
use petscdmplex
use petscsf
#endif
#endif
#endif
#endif
#endif
use geometry_definition
use gen, only : ilog, idbg, rank, str_rank, b_enable_output
use usg_mesh_data, only : num_nodes, num_cells, nodes, cells, &
num_nodes_per_cell, cell_projection
use file_unit, only : lun_get, lun_free
use petsc_mpi_common, only : petsc_mpi_finalize
implicit none
#ifdef PETSC_V3_7_X
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscdmplex.h>
#include <petscsf.h>
#elif PETSC_V3_6_X
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscdmplex.h>
#include <petscsf.h>
#elif PETSC_V3_5_X
#include <finclude/petscsys.h>
#include <finclude/petscdmplex.h>
#include <petscsf.h>
#elif PETSC_V3_4_X
#include <finclude/petscsys.h>
#include <finclude/petscdmplex.h>
#include <petscsf.h>
#endif
!c local variables
integer :: icell, inode, icnt, ifile, info_debug
integer, allocatable :: dmplex_cells(:)
real*8, allocatable :: dmplex_verts(:)
character(256) :: strtitle
PetscInt :: ipoint, istart, iend, ivalue, ndim
PetscInt :: nroots,nleaves
DM :: distributedMesh, cda
PetscSection :: cs
PetscSF :: sf
PetscErrorCode :: ierr
info_debug = 0
!c note: for 2D mesh in 3D, convert coordinates to 2D
!c before passing to PETSc DMPlex
if (cell_projection == projection_xyz) then
ndim = 3
else
ndim = 2
end if
dmda_flow%dim = ndim
if (rank == 0) then
allocate(dmplex_cells(num_cells*num_nodes_per_cell))
allocate(dmplex_verts(num_nodes*ndim))
do icell = 1, num_cells
do inode = 1, num_nodes_per_cell
icnt = (icell-1)*num_nodes_per_cell+inode
dmplex_cells(icnt) = cells(inode,icell)-1
end do
end do
if (cell_projection == projection_xyz) then
do inode = 1, num_nodes
icnt = (inode-1)*ndim
dmplex_verts(icnt + 1) = nodes(inode)%x
dmplex_verts(icnt + 2) = nodes(inode)%y
dmplex_verts(icnt + 3) = nodes(inode)%z
end do
else if (cell_projection == projection_xy) then
do inode = 1, num_nodes
icnt = (inode-1)*ndim
dmplex_verts(icnt + 1) = nodes(inode)%x
dmplex_verts(icnt + 2) = nodes(inode)%y
end do
else if (cell_projection == projection_yz) then
do inode = 1, num_nodes
icnt = (inode-1)*ndim
dmplex_verts(icnt + 1) = nodes(inode)%y
dmplex_verts(icnt + 2) = nodes(inode)%z
end do
else if (cell_projection == projection_xz) then
do inode = 1, num_nodes
icnt = (inode-1)*ndim
dmplex_verts(icnt + 1) = nodes(inode)%x
dmplex_verts(icnt + 2) = nodes(inode)%z
end do
end if
!c create DMPlex from cell list
call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,num_cells, &
num_nodes,num_nodes_per_cell, &
Petsc_False,dmplex_cells,ndim, & ! use Petsc_True to create intermediate mesh entities (faces, edges);
dmplex_verts,dmda_flow%da,ierr) ! this does not work for prisms in the current 3.8 version
CHKERRQ(ierr)
!c create DMPlex from Gmsh file
!call DMPlexCreateGmshFromFile(Petsc_Comm_World,"stripf.msh", &
! Petsc_False, dmda_flow%da, ierr)
!CHKERRQ(ierr)
call DMGetCoordinateDM(dmda_flow%da,cda,ierr)
CHKERRQ(ierr)
call DMGetDefaultSection(cda,cs,ierr)
CHKERRQ(ierr)
call PetscSectionGetChart(cs,istart,iend,ierr)
CHKERRQ(ierr)
!c add local to global cell id mapping
do ipoint = 0, istart-1
#ifdef PETSC_V3_8_X
call DMSetLabelValue(dmda_flow%da,"cid_lg2g",ipoint, &
ipoint+1,ierr)
#elif PETSC_V3_7_X
call DMSetLabelValue(dmda_flow%da,"cid_lg2g",ipoint, &
ipoint+1,ierr)
#elif PETSC_V3_6_X
call DMPlexSetLabelValue(dmda_flow%da,"cid_lg2g",ipoint, &
ipoint+1,ierr)
#elif PETSC_V3_5_X
call DMPlexSetLabelValue(dmda_flow%da,"cid_lg2g",ipoint, &
ipoint+1,ierr)
#elif PETSC_V3_4_X
call DMPlexSetLabelValue(dmda_flow%da,"cid_lg2g",ipoint, &
ipoint+1,ierr)
#else
call DMSetLabelValue(dmda_flow%da,"cid_lg2g",ipoint, &
ipoint+1,ierr)
#endif
CHKERRQ(ierr)
end do
!c add local to global vertex id mapping
do ipoint = istart, iend-1
#ifdef PETSC_V3_8_X
call DMSetLabelValue(dmda_flow%da,"vid_lg2g",ipoint, &
ipoint-istart+1,ierr)
#elif PETSC_V3_7_X
call DMSetLabelValue(dmda_flow%da,"vid_lg2g",ipoint, &
ipoint-istart+1,ierr)
#elif PETSC_V3_6_X
call DMPlexSetLabelValue(dmda_flow%da,"vid_lg2g",ipoint, &
ipoint-istart+1,ierr)
#elif PETSC_V3_5_X
call DMPlexSetLabelValue(dmda_flow%da,"vid_lg2g",ipoint, &
ipoint-istart+1,ierr)
#elif PETSC_V3_4_X
call DMPlexSetLabelValue(dmda_flow%da,"vid_lg2g",ipoint, &
ipoint-istart+1,ierr)
#else
call DMSetLabelValue(dmda_flow%da,"vid_lg2g",ipoint, &
ipoint-istart+1,ierr)
#endif
CHKERRQ(ierr)
end do
deallocate(dmplex_cells)
deallocate(dmplex_verts)
else
call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0, &
num_nodes_per_cell, &
Petsc_False,dmplex_cells,ndim, & ! use Petsc_True to create intermediate mesh entities (faces, edges);
dmplex_verts,dmda_flow%da,ierr) ! this does not work for prisms in the current 3.8 version
CHKERRQ(ierr)
end if
#ifdef DEBUG
if (rank == 0 .and. b_enable_output) then
write(*,'(a)') "Create DMPlex from cell list, done"
end if
#endif
!c distribute mesh over processes
call DMPlexDistribute(dmda_flow%da,stencil_width, &
#ifdef PETSC_V3_8_X
PETSC_NULL_SF, &
#elif PETSC_V3_7_X
PETSC_NULL_OBJECT, &
#elif PETSC_V3_6_X
PETSC_NULL_OBJECT, &
#elif PETSC_V3_5_X
PETSC_NULL_OBJECT, &
#elif PETSC_V3_4_X
PETSC_NULL_OBJECT, &
#else
PETSC_NULL_SF, &
#endif
distributedMesh,ierr)
CHKERRQ(ierr)
#ifdef DEBUG
if (rank == 0 .and. b_enable_output) then
write(*,'(a)') "Distribute DMPlex from cell list, done"
end if
#endif
!c destroy original global mesh after distribution
#ifdef PETSC_V3_8_X
if (distributedMesh /= PETSC_NULL_DM) then
#elif PETSC_V3_7_X
if (distributedMesh /= PETSC_NULL_OBJECT) then
#elif PETSC_V3_6_X
if (distributedMesh /= PETSC_NULL_OBJECT) then
#elif PETSC_V3_5_X
if (distributedMesh /= PETSC_NULL_OBJECT) then
#elif PETSC_V3_4_X
if (distributedMesh /= PETSC_NULL_OBJECT) then
#else
if (distributedMesh /= PETSC_NULL_DM) then
#endif
call DMDestroy(dmda_flow%da,ierr)
CHKERRQ(ierr)
!c set the global mesh as distributed mesh
dmda_flow%da = distributedMesh
end if
!c query the point SF of the distributed mesh; its leaves are the
!c points this rank holds as copies but does not own
call DMGetPointSF(dmda_flow%da,sf,ierr)
CHKERRQ(ierr)
!c only the counts are needed here, so no leaf or remote arrays
!c are requested
call PetscSFGetGraph(sf,nroots,nleaves,PETSC_NULL_INTEGER, &
PETSC_NULL_INTEGER,ierr)
CHKERRQ(ierr)
write(*,*) "rank",rank,"nroots",nroots,"nleaves",nleaves
!call PetscSFView(sf,PETSC_VIEWER_STDOUT_WORLD,ierr)
!CHKERRQ(ierr)
write(*,*) "End of PetscSFView"
#endif
end subroutine solver_dd_create_dmplex