Before your pull request was merged to main, I copied your change into my
local /src/dm/impls/plex/plexgmsh.c to test pyramids.

Loading the mesh including pyramids into DMPlex works okay, but writing the
solution field out in .vtk or .vtu format triggers an "Unknown Cell Type"
error from post-processing tools (such as Tecplot or ParaView). I had a look
at /src/dm/impls/plex/plexvtk.c, and the function
DMPlexVTKGetCellType_Internal() does not seem to explicitly include a
cellType marker for prisms or pyramids. VTK_WEDGE is probably the prism with
a triangular base, but there still appears to be no marker for pyramids.
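If that is indeed the gap, I would guess the fix is just adding the missing
entries to whatever table maps Plex cell types to VTK ids. As a sketch only
(PolytopeToVTK is a hypothetical helper, not the actual PETSc source; the
numeric ids themselves are fixed by the VTK file format):

    #include <petscdmplex.h>

    /* Hypothetical sketch: map a DMPlex polytope to the standard VTK
       cell-type id used by .vtk/.vtu writers. The ids come from the
       VTK file format specification. */
    static PetscInt PolytopeToVTK(DMPolytopeType ct)
    {
      switch (ct) {
      case DM_POLYTOPE_TETRAHEDRON: return 10; /* VTK_TETRA      */
      case DM_POLYTOPE_HEXAHEDRON:  return 12; /* VTK_HEXAHEDRON */
      case DM_POLYTOPE_TRI_PRISM:   return 13; /* VTK_WEDGE      */
      case DM_POLYTOPE_PYRAMID:     return 14; /* VTK_PYRAMID    */
      default:                      return -1; /* unknown cell type */
      }
    }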
Thanks,

> On Wed, Jul 13, 2022 at 11:18 AM Mike Michell <[email protected]> wrote:
>
>> Attached is a mixed mesh file that I am testing. I cannot see any special
>> marker for pyramid cells. Version of gmsh is 4.9.0.
>
> You are correct. Pyramids were disabled. I have activated them here:
>
> https://gitlab.com/petsc/petsc/-/merge_requests/5422
>
> Your mesh runs fine after this for me. I am not sure things like geometry
> will work for pyramids. However, if you find something broken, just let
> me know.
>
> Thanks,
>
> Matt
>
>> Thanks,
>>
>>> On Wed, Jul 13, 2022 at 10:41 AM Mike Michell <[email protected]> wrote:
>>>
>>>> Thank you for the quick response. Below is the full error message I
>>>> get.
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [0]PETSC ERROR: Argument out of range
>>>> [0]PETSC ERROR: No face description for cell type unknown
>>>
>>> Here is the problem. The pyramid (or some other cell) was classified as
>>> "unknown". First, make sure the Gmsh file is version 4.1. If that fails,
>>> send the Gmsh file and I will try to figure out why the cell is not
>>> coming up as a pyramid.
>>>
>>> Thanks,
>>>
>>> Matt
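(A note on version numbers, since they are easy to mix up here: the "4.9.0"
above is the version of the Gmsh program itself, while the "4.1" Matt asks
about is the version of the MSH file format the program writes. Assuming the
standard Gmsh options, the 4.1 format can be forced either with

    Mesh.MshFileVersion = 4.1;

in the .geo file, or with -format msh41 on the gmsh command line.)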
>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Release Version 3.17.0, unknown
>>>> [0]PETSC ERROR: /home/Mike/Workspace/test on a named Mike Wed Jul 13 10:38:31 2022
>>>> [0]PETSC ERROR: Configure options --prefix=/home/Mike/Library/petsc_partition/install-intel PETSC_ARCH=linux-gnu-intel --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --download-fblaslapack --download-metis --download-parmetis --download-eigen --download-pragmatic --download-hdf5 --download-triangle --with-debugging=1 COPTFLAGS="-O3 -mtune=native" CXXOPTFLAGS=-O3 FOPTFLAGS=-O3
>>>> [0]PETSC ERROR: #1 DMPlexGetRawFaces_Internal() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexinterpolate.c:312
>>>> [0]PETSC ERROR: #2 DMPlexInterpolateFaces_Internal() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexinterpolate.c:350
>>>> [0]PETSC ERROR: #3 DMPlexInterpolate() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexinterpolate.c:1327
>>>> [0]PETSC ERROR: #4 DMPlexCreateGmsh() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexgmsh.c:1634
>>>> [0]PETSC ERROR: #5 DMPlexCreateGmshFromFile() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexgmsh.c:1418
>>>> [0]PETSC ERROR: #6 DMPlexCreateFromFile() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexcreate.c:4721
>>>> [0]PETSC ERROR: #7 DMPlexCreateFromOptions_Internal() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexcreate.c:3212
>>>> [0]PETSC ERROR: #8 DMSetFromOptions_Plex() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexcreate.c:3433
>>>> [0]PETSC ERROR: #9 DMSetFromOptions() at /home/Mike/Library/petsc_partition/src/dm/interface/dm.c:887
>>>> [0]PETSC ERROR: #10 User provided function() at User file:0
>>>> Abort(63) on node 0 (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 63) - process 0
>>>>
>>>> Thanks,
>>>> Mike
>>>>
>>>>> On Wed, Jul 13, 2022 at 10:30 AM Mike Michell <[email protected]> wrote:
>>>>>
>>>>>> Hi, DMCreate() is used to load and distribute a grid built with gmsh,
>>>>>> and the function crashes on a mixed mesh of tetrahedra and pyramids
>>>>>> in 3D. It looks like PETSc can handle the cell types
>>>>>> tetra/hexa/prism/pyramid/polygon/polyhedron in 3D, so I am unsure why
>>>>>> it crashes when pyramids are included in the gmsh file.
>>>>>>
>>>>>> The functions below are used for the mesh distribution:
>>>>>>
>>>>>> call DMCreate(PETSC_COMM_WORLD, dm_g, ierr);CHKERRA(ierr)
>>>>>> call DMSetType(dm_g, DMPLEX, ierr);CHKERRA(ierr)
>>>>>> call DMSetFromOptions(dm_g, ierr);CHKERRA(ierr)
>>>>>>
>>>>>> The error messages below are from the run.
>>>>>
>>>>> I need the complete error message. This is only the stack. I cannot
>>>>> see the error message, or the version of PETSc you are using.
>>>>>
>>>>> This should work, so I do not immediately know what is wrong.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
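(For reference, with DMSetFromOptions() in the driver as above, the mesh file
is picked up from the options database via -dm_plex_filename, so a run would
look something like

    ./test -dm_plex_filename mixed.msh

where the executable name and "mixed.msh" are placeholders for the actual
program and Gmsh file.)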
>>>>>> [0]PETSC ERROR: #1 DMPlexGetRawFaces_Internal() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexinterpolate.c:312
>>>>>> [0]PETSC ERROR: #2 DMPlexInterpolateFaces_Internal() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexinterpolate.c:350
>>>>>> [0]PETSC ERROR: #3 DMPlexInterpolate() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexinterpolate.c:1327
>>>>>> [0]PETSC ERROR: #4 DMPlexCreateGmsh() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexgmsh.c:1634
>>>>>> [0]PETSC ERROR: #5 DMPlexCreateGmshFromFile() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexgmsh.c:1418
>>>>>> [0]PETSC ERROR: #6 DMPlexCreateFromFile() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexcreate.c:4721
>>>>>> [0]PETSC ERROR: #7 DMPlexCreateFromOptions_Internal() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexcreate.c:3212
>>>>>> [0]PETSC ERROR: #8 DMSetFromOptions_Plex() at /home/Mike/Library/petsc_partition/src/dm/impls/plex/plexcreate.c:3433
>>>>>> [0]PETSC ERROR: #9 DMSetFromOptions() at /home/Mike/Library/petsc_partition/src/dm/interface/dm.c:887
>>>>>> [0]PETSC ERROR: #10 User provided function() at User file:0
>>>>>> Abort(63) on node 0 (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 63) - process 0
>>>>>>
>>>>>> Can I get any comments on that?
>>>>>>
>>>>>> Thanks,
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
