On Tue 31. May 2022 at 16:28, Ye Changqing <ye_changq...@outlook.com> wrote:
> Dear developers of PETSc,
>
> I encountered a problem when using the DMStag module. The program runs
> perfectly in serial, while errors are thrown in parallel (using mpiexec):
> some rows of the Mat cannot be accessed by the local process when looping
> over all elements of the DMStag. The DM object I used has only one DOF per
> element, so I could switch to the DMDA module easily, and the program is
> now back to normal.
>
> Some snippets are below.
>
> Initialise a DMStag object:
>
>   PetscCall(DMStagCreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE,
>     DM_BOUNDARY_NONE, M, N, PETSC_DECIDE, PETSC_DECIDE, 0, 0, 1,
>     DMSTAG_STENCIL_BOX, 1, NULL, NULL, &(s_ctx->dm_P)));
>
> Create a Mat:
>
>   PetscCall(DMCreateMatrix(s_ctx->dm_P, A));
>
> Loop:
>
>   PetscCall(DMStagGetCorners(s_ctx->dm_V, &startx, &starty, &startz, &nx,
>     &ny, &nz, &extrax, &extray, &extraz));
>   for (ey = starty; ey < starty + ny; ++ey)
>     for (ex = startx; ex < startx + nx; ++ex)
>     {
>       ...
>       PetscCall(DMStagMatSetValuesStencil(s_ctx->dm_P, *A, 2, &row[0], 2,
>         &col[0], &val_A[0][0], ADD_VALUES)); // The traceback shows the
>                                              // problem is here.
>     }

In addition to the code or an MWE, please forward us the complete stack trace / error message printed to stdout.

Thanks,
Dave

> Best,
> Changqing
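For comparison, here is a minimal self-contained sketch of a parallel DMStag matrix assembly loop with one element-centered DOF, in which DMStagGetCorners() is called on the same DM that created the matrix. This is illustrative only: the placeholder diagonal entry, the names (`dm`, `A`), and the grid sizes are assumptions, not taken from the original code.

```c
/* Sketch: parallel assembly on a DMStag with one element-centered DOF.
 * Corners are queried from the SAME DM used by DMCreateMatrix(), so every
 * (ex, ey) this rank visits maps to a row this rank may legally set.
 * Requires PETSc (>= 3.17 for PetscCall); values are placeholders. */
#include <petscdmstag.h>

int main(int argc, char **argv)
{
  DM       dm;
  Mat      A;
  PetscInt startx, starty, nx, ny, ex, ey, M = 8, N = 8;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMStagCreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           M, N, PETSC_DECIDE, PETSC_DECIDE,
                           0, 0, 1, /* DOF per vertex / face / element */
                           DMSTAG_STENCIL_BOX, 1, NULL, NULL, &dm));
  PetscCall(DMSetFromOptions(dm));
  PetscCall(DMSetUp(dm));
  PetscCall(DMCreateMatrix(dm, &A));

  /* Unneeded outputs of DMStagGetCorners() may be NULL. */
  PetscCall(DMStagGetCorners(dm, &startx, &starty, NULL, &nx, &ny, NULL,
                             NULL, NULL, NULL));
  for (ey = starty; ey < starty + ny; ++ey) {
    for (ex = startx; ex < startx + nx; ++ex) {
      DMStagStencil row;
      PetscScalar   v = 1.0; /* placeholder diagonal entry */

      row.i   = ex;
      row.j   = ey;
      row.k   = 0;
      row.c   = 0;
      row.loc = DMSTAG_ELEMENT;
      PetscCall(DMStagMatSetValuesStencil(dm, A, 1, &row, 1, &row, &v,
                                          ADD_VALUES));
    }
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatDestroy(&A));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

Note that the snippet in the report queries corners from `s_ctx->dm_V` but assembles into a matrix created from `s_ctx->dm_P`; whether that is related to the failure cannot be judged without the full stack trace.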