Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases

2022-11-14 Thread Junchao Zhang
Hi, Philip,

Can you send me instructions to build Xolotl so I can reproduce the error?

--Junchao Zhang

On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users <petsc-users@mcs.anl.gov> wrote:
> In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use
> the COO interface for

Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases

2022-11-14 Thread Barry Smith
   Mat of type (null)

Either the entire matrix (header) data structure has gotten corrupted or the matrix type was never set. Can you run with valgrind to see if there is any memory corruption?

> On Nov 14, 2022, at 1:24 PM, Fackler, Philip via petsc-users
> wrote:
>
> In Xolotl's
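
For the "type was never set" possibility, a small diagnostic helper can confirm that the Jacobian actually has a type before the preconditioner touches it. This is only a sketch; the function name CheckMatTypeSet and where it would be called are assumptions, not part of Xolotl or PETSc:

  #include <petscmat.h>

  /* Hypothetical check: fail early if the Mat has no type, which would
     show up as "(null)" in a PETSc error message. */
  static PetscErrorCode CheckMatTypeSet(Mat J)
  {
    MatType type = NULL;

    PetscFunctionBeginUser;
    PetscCall(MatGetType(J, &type));
    PetscCheck(type, PetscObjectComm((PetscObject)J), PETSC_ERR_ARG_WRONGSTATE,
               "Matrix type was never set");
    PetscCall(PetscPrintf(PetscObjectComm((PetscObject)J), "Jacobian type: %s\n", type));
    PetscFunctionReturn(0);
  }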

[petsc-users] Using multiple MPI ranks with COO interface crashes in some cases

2022-11-14 Thread Fackler, Philip via petsc-users
In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a
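
For context, the COO preallocation/assembly path referred to above follows roughly the pattern below. This is a minimal sketch with a hypothetical 2x2 AIJ matrix and hard-coded entries, not the actual Xolotl code:

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    /* Hypothetical COO pattern: the two diagonal entries of a 2x2 matrix */
    PetscInt    coo_i[] = {0, 1}, coo_j[] = {0, 1};
    PetscScalar coo_v[] = {1.0, 2.0};

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 2, 2));
    PetscCall(MatSetType(A, MATAIJ));                        /* type must be set before the COO calls */
    PetscCall(MatSetPreallocationCOO(A, 2, coo_i, coo_j));   /* "preallocating" step                  */
    PetscCall(MatSetValuesCOO(A, coo_v, INSERT_VALUES));     /* "setting values" step                 */
    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
  }

With more than one MPI rank, the coo_i/coo_j arrays on each rank would hold that rank's (global) row and column indices, which is the configuration the crash report concerns.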