Hi,

I just noticed that commit b6b5caf99979f50a7774afdccea5ca5661fc1203 ("Merge branch 'knepley/feature-hybrid-mass' into 'main'") seems to have introduced a bug.

See the attached example (a very simple modification of src/dm/impls/plex/tests/ex98.c).
When calling DMCreateMatrix(), I get the following error:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Need to call DMCreateDS() before calling DMGetDS()
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-dm_mat_view (no value) source: command line
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.5-1161-gb6b5caf9997  GIT Date: 2023-03-27 14:14:19 +0000
[0]PETSC ERROR: ./ex98 on a ventura-gcc12.2-arm64-g named sibookpro.home by blaise Thu Apr 20 21:14:40 2023
[0]PETSC ERROR: Configure options --CFLAGS="-Wimplicit-function-declaration -Wunused -Wuninitialized" --FFLAGS="-ffree-line-length-none -fallow-argument-mismatch -Wunused -Wuninitialized" --download-exodusii=1 --download-hdf5=1 --download-netcdf=1 --download-ml=1 --download-pnetcdf=1 --download-zlib=1 --with-debugging=1 --with-exodusii-fortran-bindings --with-shared-libraries=1 --with-x11=1
[0]PETSC ERROR: #1 DMGetDS() at /opt/HPC/petsc-main/src/dm/interface/dm.c:5419
[0]PETSC ERROR: #2 DMPlexPreallocateOperator() at /opt/HPC/petsc-main/src/dm/impls/plex/plexpreallocate.c:710
[0]PETSC ERROR: #3 DMCreateMatrix_Plex() at /opt/HPC/petsc-main/src/dm/impls/plex/plex.c:2662
[0]PETSC ERROR: #4 DMCreateMatrix() at /opt/HPC/petsc-main/src/dm/interface/dm.c:1476
[0]PETSC ERROR: #5 main() at ex98.c:92
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -dm_mat_view (source: command line)
[0]PETSC ERROR: -i /Users/blaise/Development/mef90/mef90-dmplex/TestMeshes/SquareFaceSet.msh (source: command line)
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
Abort(73) on node 0 (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 73) - process 0


Do I now need to set up a DS in order to use DMPlex and sections?
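For reference, here is a minimal sketch (not the attached ex98.c, which I cannot reproduce here) of the kind of call sequence I mean: a Plex with a hand-built local section and no PetscFE/PetscDS, followed by DMCreateMatrix(). My understanding from the trace is that DMCreateMatrix() now reaches DMGetDS() via DMPlexPreallocateOperator(), so perhaps an explicit DMCreateDS() (commented below) is now expected; I am not sure that it should be.

```c
/* Sketch: Plex + plain PetscSection, no DS.  Assumes the 9-argument
 * DMPlexCreateBoxMesh() signature of the petsc-main branch current at
 * the time of writing. */
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM           dm;
  PetscSection s;
  Mat          A;
  PetscInt     pStart, pEnd, p;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, NULL, NULL, NULL, NULL, PETSC_TRUE, &dm));

  /* Hand-built section: one dof per vertex, no PetscFE/PetscDS involved */
  PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &s));
  PetscCall(DMPlexGetDepthStratum(dm, 0, &pStart, &pEnd));
  PetscCall(PetscSectionSetChart(s, pStart, pEnd));
  for (p = pStart; p < pEnd; ++p) PetscCall(PetscSectionSetDof(s, p, 1));
  PetscCall(PetscSectionSetUp(s));
  PetscCall(DMSetLocalSection(dm, s));
  PetscCall(PetscSectionDestroy(&s));

  /* PetscCall(DMCreateDS(dm)); */ /* <- is this now required? */
  PetscCall(DMCreateMatrix(dm, &A)); /* errors on current main as reported above */

  PetscCall(MatDestroy(&A));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```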

Regards,
Blaise



Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243

Attachment: ex98.c
