It appears you are completely ignoring the vec argument? Take a look at, for example, DMCreateInterpolation_DA(); you will see that you need to provide an appropriate vec in the same way that you need to (and do) provide an appropriate mat.
Barry

> On Dec 10, 2024, at 7:02 AM, Matteo Semplice via petsc-users <petsc-users@mcs.anl.gov> wrote:
>
> Dear petsc-users,
>
> I am trying with a student to modify the MG example 65 to use mat-shells instead of assembled matrices. Our use case is for a method that will use custom shell operators and shell interpolation/restrictions.
>
> To start we have modified ex65 and tried to replace the standard restriction/interpolation with a shell matrix that performs the same operations on the DMDA grids.
>
> I attach our modifications to ex65.
>
> The problem is that the code sometimes completes execution and sometimes errors out like
>
> $ ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1
> >> Created DMshell 0x55bbc83c91a0 (0x55bbc8390a80)
> Calling KSPSolve from main
> computeRHS on grid 513
> computeMatrix on grid 513
> Inside Coarsen
> >> Created DMshell 0x55bbc84c5270 (0x55bbc84b2ff0)
> Inside Coarsen
> >> Created DMshell 0x55bbc84e0a30 (0x55bbc84bdb60)
> >> Create interpolation from 0x55bbc84c5270(0x55bbc84b2ff0) to 0x55bbc83c91a0(0x55bbc8390a80)
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Invalid pointer
> [0]PETSC ERROR: Invalid Pointer to PetscObject: Argument 'obj' (parameter # 1)
> [0]PETSC ERROR: See https://urldefense.us/v3/__https://petsc.org/release/faq/__;!!G_uCfscf7eWS!ZLNRcAWCboFilZVAcU1gjUddxWbwjb25jRNK4CCZCz9YuJ22SpOf-bCIgcXbHcwhuzs4UiqEsjcnvKnre3OGa-c$ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.22.0, unknown
> [0]PETSC ERROR: ./ex65shell with 1 MPI process(es) and PETSC_ARCH on signalkuppe by matteo Tue Dec 10 12:55:12 2024
> [0]PETSC ERROR: Configure options: --prefix=/home/matteo/software/petscsaved/3.22-opt/ PETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg --with-debugging=1 --with-strict-petscerrorcode --download-hdf5 --download-ml --with-metis --with-parmetis --with-gmsh --with-triangle --with-zlib --with-p4est-dir=~/software/p4est/local/
> [0]PETSC ERROR: #1 PetscObjectReference() at /home/matteo/software/petsc/src/sys/objects/inherit.c:620
> [0]PETSC ERROR: #2 PCMGSetRScale() at /home/matteo/software/petsc/src/ksp/pc/impls/mg/mgfunc.c:394
> [0]PETSC ERROR: #3 PCSetUp_MG() at /home/matteo/software/petsc/src/ksp/pc/impls/mg/mg.c:998
> [0]PETSC ERROR: #4 PCSetUp() at /home/matteo/software/petsc/src/ksp/pc/interface/precon.c:1071
> [0]PETSC ERROR: #5 KSPSetUp() at /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:415
> [0]PETSC ERROR: #6 KSPSolve_Private() at /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:826
> [0]PETSC ERROR: #7 KSPSolve() at /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:1075
> [0]PETSC ERROR: #8 main() at ../src/ex65shell.c:79
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -da_refine 2 (source: command line)
> [0]PETSC ERROR: -ksp_monitor (source: command line)
> [0]PETSC ERROR: -ksp_rtol 1e-1 (source: command line)
> [0]PETSC ERROR: -pc_type mg (source: command line)
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
>
> In all cases valgrind complains like
>
> $ valgrind ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1 -int_view ascii::ascii_info
> ==2130767== Memcheck, a memory error detector
> ==2130767== Copyright (C) 2002-2022, and GNU GPL'd, by Julian Seward et al.
> ==2130767== Using Valgrind-3.19.0 and LibVEX; rerun with -h for copyright info
> ==2130767== Command: ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1 -int_view ascii::ascii_info
> ==2130767==
> hwloc x86 backend cannot work under Valgrind, disabling.
> May be reenabled by dumping CPUIDs with hwloc-gather-cpuid
> and reloading them under Valgrind with HWLOC_CPUID_PATH.
> >> Created DMshell 0xfbf50f0 (0xfb49480)
> Calling KSPSolve from main
> computeRHS on grid 513
> computeMatrix on grid 513
> Inside Coarsen
> >> Created DMshell 0xff404a0 (0xff062d0)
> Inside Coarsen
> >> Created DMshell 0xff78ca0 (0xff4a530)
> >> Create interpolation from 0xff404a0(0xff062d0) to 0xfbf50f0(0xfb49480)
> Mat Object: 1 MPI process
>   type: shell
>   rows=513, cols=257
> ==2130767== Conditional jump or move depends on uninitialised value(s)
> ==2130767==    at 0x83FDAAE: PCSetUp_MG (mg.c:998)
> ==2130767==    by 0x8620C04: PCSetUp (precon.c:1071)
> ==2130767==    by 0x7DA78D5: KSPSetUp (itfunc.c:415)
> ==2130767==    by 0x7DB2093: KSPSolve_Private (itfunc.c:826)
> ==2130767==    by 0x7DB7A53: KSPSolve (itfunc.c:1075)
> ==2130767==    by 0x10D2F7: main (ex65shell.c:79)
> ==2130767==
> ==2130767== Conditional jump or move depends on uninitialised value(s)
> ==2130767==    at 0x5794061: VecDestroy (vector.c:570)
> ==2130767==    by 0x83FDC36: PCSetUp_MG (mg.c:999)
> ==2130767==    by 0x8620C04: PCSetUp (precon.c:1071)
> ==2130767==    by 0x7DA78D5: KSPSetUp (itfunc.c:415)
> ==2130767==    by 0x7DB2093: KSPSolve_Private (itfunc.c:826)
> ==2130767==    by 0x7DB7A53: KSPSolve (itfunc.c:1075)
> ==2130767==    by 0x10D2F7: main (ex65shell.c:79)
>
> We are clearly doing something wrong, since PCSetUp_MG (mg.c:998) is an area of code where I wouldn't expect we would enter.
>
> Can you advise on the proper way to achieve our goal?
>
> Best regards
>
> Matteo
>
> <ex65.patch>