Hello everyone,
I wanted to run some experiments starting from step-64, so I reinstalled deal.II with CUDA support (using Spack).
Unfortunately, step-64 now works perfectly, but several other examples that do not use CUDA but rely on PETSc no longer do.
For example, if I run step-40 I get the following error:

mpirun -np 1 step-40.debug 
Running with PETSc on 1 MPI rank(s)... 
Cycle 0: 
Number of active cells: 1024 
Number of degrees of freedom: 4225 
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: HYPRE_MEMORY_DEVICE expects a device vector. You need to enable PETSc device support, for example, in some cases, -vec_type cuda
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.17.1, Apr 28, 2022
[0]PETSC ERROR: step-40.debug on a named Zorn by step Tue Dec 27 21:36:21 2022
[0]PETSC ERROR: Configure options
--prefix=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku
--with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0
--with-cc=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/openmpi-4.1.4-3ea3p3wv5a53ohy3fxui7yygr2m4nlrb/bin/mpicc
--with-cxx=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/openmpi-4.1.4-3ea3p3wv5a53ohy3fxui7yygr2m4nlrb/bin/mpic++
--with-fc=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/openmpi-4.1.4-3ea3p3wv5a53ohy3fxui7yygr2m4nlrb/bin/mpif90
--with-precision=double --with-scalar-type=real --with-shared-libraries=1 --with-debugging=0 --with-openmp=0 --with-64-bit-indices=0
--with-blaslapack-lib=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/openblas-0.3.21-ozedcu5tn56y23uoxrx6zjf2tnhncvyg/lib/libopenblas.so
--with-x=0 --with-clanguage=C --with-cuda=1
--with-cuda-dir=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/cuda-11.8.0-xzhbnrb3zgd6zsrdjaa4an3saa7dkkf3
--with-hip=0 --with-metis=1
--with-metis-include=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/metis-5.1.0-imnzuza4aintah4v6c3e3nciaux2opc7/include
--with-metis-lib=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/metis-5.1.0-imnzuza4aintah4v6c3e3nciaux2opc7/lib/libmetis.so
--with-hypre=1
--with-hypre-include=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hypre-2.27.0-luxpg32fzxo2vgcnb5iq53lbrvxberiw/include
--with-hypre-lib=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hypre-2.27.0-luxpg32fzxo2vgcnb5iq53lbrvxberiw/lib/libHYPRE.so
--with-parmetis=1
--with-parmetis-include=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/parmetis-4.0.3-uckybgzpowx5rdwtskefzomdapasgpoc/include
--with-parmetis-lib=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/parmetis-4.0.3-uckybgzpowx5rdwtskefzomdapasgpoc/lib/libparmetis.so
--with-kokkos=0 --with-kokkos-kernels=0 --with-superlu_dist=1
--with-superlu_dist-include=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/superlu-dist-8.1.2-qlmlkwfer4h53d7vis7lpjtcu77bp3yn/include
--with-superlu_dist-lib=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/superlu-dist-8.1.2-qlmlkwfer4h53d7vis7lpjtcu77bp3yn/lib/libsuperlu_dist.so
--with-ptscotch=0 --with-suitesparse=0 --with-hdf5=1
--with-hdf5-include=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/include
--with-hdf5-lib="/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/lib/libhdf5_hl_fortran.so /home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/lib/libhdf5_hl_f90cstub.so /home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/lib/libhdf5_hl.so /home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/lib/libhdf5_fortran.so /home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/lib/libhdf5_f90cstub.so /home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/hdf5-1.10.7-xjt3eeewg3eihjdvjnzdavfpuy5mmpjr/lib/libhdf5.so"
--with-zlib=1
--with-zlib-include=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/zlib-1.2.13-rcig5et6juu4y42r6v5z6fvkakzygdox/include
--with-zlib-lib=/home/step/environments/dealii-9.4_gcc-11.3/spack/opt/spack/linux-ubuntu22.04-skylake/gcc-11.3.0/zlib-1.2.13-rcig5et6juu4y42r6v5z6fvkakzygdox/lib/libz.so
--with-mumps=0 --with-trilinos=0 --with-fftw=0 --with-valgrind=0 --with-gmp=0 --with-libpng=0 --with-giflib=0 --with-mpfr=0 --with-netcdf=0 --with-pnetcdf=0 --with-moab=0 --with-random123=0 --with-exodusii=0 --with-cgns=0 --with-memkind=0 --with-p4est=0 --with-saws=0 --with-yaml=0 --with-hwloc=0 --with-libjpeg=0 --with-scalapack=0 --with-strumpack=0 --with-mmg=0 --with-parmmg=0 --with-tetgen=0 --with-cuda-arch=61
[0]PETSC ERROR: #1 VecGetArrayForHYPRE() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/vec/vec/impls/hypre/vhyp.c:94
[0]PETSC ERROR: #2 VecHYPRE_IJVectorPushVecRead() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/vec/vec/impls/hypre/vhyp.c:137
[0]PETSC ERROR: #3 PCApply_HYPRE() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/ksp/pc/impls/hypre/hypre.c:434
[0]PETSC ERROR: #4 PCApply() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/ksp/pc/interface/precon.c:432
[0]PETSC ERROR: #5 KSP_PCApply() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/include/petsc/private/kspimpl.h:376
[0]PETSC ERROR: #6 KSPSolve_CG() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/ksp/ksp/impls/cg/cg.c:135
[0]PETSC ERROR: #7 KSPSolve_Private() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/ksp/ksp/interface/itfunc.c:902
[0]PETSC ERROR: #8 KSPSolve() at /tmp/step/spack-stage/spack-stage-petsc-3.17.1-wzfeckr7omoetuquumz77nloe7u6mpku/spack-src/src/ksp/ksp/interface/itfunc.c:1078
 

--------------------------------------------------------- 
TimerOutput objects finalize timed values printed to the 
screen by communicating over MPI in their destructors. 
Since an exception is currently uncaught, this 
synchronization (and subsequent output) will be skipped 
to avoid a possible deadlock. 
--------------------------------------------------------- 


---------------------------------------------------- 
Exception on processing: 

-------------------------------------------------------- 
An error occurred in line <138> of file </home/step/environments/dealii-9.4_gcc-11.3/dealii-9.4.1-source/source/lac/petsc_solver.cc>
in function
    void dealii::PETScWrappers::SolverBase::solve(const dealii::PETScWrappers::MatrixBase&, dealii::PETScWrappers::VectorBase&, const dealii::PETScWrappers::VectorBase&, const dealii::PETScWrappers::PreconditionBase&)
The violated condition was: 
ierr == 0 
Additional information: 
deal.II encountered an error while calling a PETSc function. 
The description of the error provided by PETSc is "Invalid argument". 
The numerical value of the original error code is 62. 

Stacktrace: 
----------- 
#0  /home/step/environments/dealii-9.4_gcc-11.3/dealii-9.4.1-build/lib/libdeal_II.g.so.9.4.1: dealii::PETScWrappers::SolverBase::solve(dealii::PETScWrappers::MatrixBase const&, dealii::PETScWrappers::VectorBase&, dealii::PETScWrappers::VectorBase const&, dealii::PETScWrappers::PreconditionBase const&)
#1 step-40.debug: Step40::LaplaceProblem<2>::solve() 
#2 step-40.debug: Step40::LaplaceProblem<2>::run() 
#3 step-40.debug: main 
-------------------------------------------------------- 

Aborting! 
---------------------------------------------------- 
-------------------------------------------------------------------------- 
Primary job terminated normally, but 1 process returned 
a non-zero exit code. Per user-direction, the job has been aborted. 
-------------------------------------------------------------------------- 
-------------------------------------------------------------------------- 
mpirun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: 

Process name: [[46368,1],0] 
Exit code: 1 
--------------------------------------------------------------------------

I understand that the error is probably related to the fact that I never told PETSc that my vectors should stay on the CPU, but I don't know how to do that. PETSc seems to provide a function I could use for this, but I cannot find an equivalent in the deal.II wrappers.
I am using deal.II 9.4.1, PETSc 3.17.1 and hypre 2.27.0. Thank you!
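
To make the question concrete, this is the kind of thing I was imagining: a minimal sketch using PetscOptionsSetValue with the "-vec_type cuda" value that the error message itself suggests. PetscOptionsSetValue may well be the wrong function, and I do not know whether the deal.II wrappers ever look at this option when they create their vectors:

#include <deal.II/base/mpi.h>
#include <petscsys.h>

int main(int argc, char *argv[])
{
  // deal.II initializes PETSc here and forwards argc/argv to PetscInitialize()
  dealii::Utilities::MPI::MPI_InitFinalize mpi_initialization(argc, argv, 1);

  // Set a PETSc option programmatically (NULL = global options database),
  // hoping that the vectors used by the hypre preconditioner pick it up.
  const PetscErrorCode ierr = PetscOptionsSetValue(NULL, "-vec_type", "cuda");
  (void)ierr; // error handling omitted in this sketch

  // ... then run the usual step-40 LaplaceProblem as in the tutorial ...
  return 0;
}

Since MPI_InitFinalize forwards argc/argv to PetscInitialize, I suppose the same option could also be passed on the command line, e.g. "mpirun -np 1 step-40.debug -vec_type cuda", but I am not sure that is the intended way either.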

Stefano
