If I do, the generated binaryoutput.info file is empty and binaryoutput itself only contains the solver names...

cat binaryoutput:
{Wgmres{Vilu

Do I have an error in my PETSc build?
This, by the way, happens with all matrices; I even tried a couple of examples from the PETSc src.

Anyway, I uploaded the matrix dump (written with -mat_view binary); I hope this helps.
<http://homepages.physik.uni-muenchen.de/~Fabian.Jakub/>

To be thorough, I also put up a second file containing two matrices.
The second matrix is the same as before (the one that does not work with GAMG). The first one, with dof 8, is solved first, and its solution essentially provides the source term for the second. I thought I would append it as well because it does work with GAMG.
Maybe that gives some hints.
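
In case it saves you a step, this is roughly how I imagine the dump would be read back in and handed to the solver (a minimal, untested Fortran sketch; the names viewer, A, ksp and ierr are assumed to be declared with the usual PETSc Fortran includes, and the file name binaryoutput is an assumption):

! sketch only: load the matrix written by -mat_view binary
call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'binaryoutput', FILE_MODE_READ, viewer, ierr)
call MatCreate(PETSC_COMM_WORLD, A, ierr)
call MatLoad(A, viewer, ierr)
call PetscViewerDestroy(viewer, ierr)

! hand it to a KSP so that -pc_type gamg etc. are picked up from the options
call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
call KSPSetOperators(ksp, A, A, ierr)
call KSPSetFromOptions(ksp, ierr)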

I really appreciate you looking into this.

Thank you!

Sincerely,
Fabian



On 09.07.2014 07:37, Barry Smith wrote:
   Can you run with -ksp_view_pre binary and email the file binaryoutput?

    With this we should be able to reproduce the problem and debug it.

    Thanks


   Barry

On Jul 8, 2014, at 10:51 AM, Fabian.Jakub <[email protected]> wrote:

Thank you very much for the swift reply!

I am running
git describe: v3.5-25-g0ace994
and configured with

./configure                           \
  --with-cc=$(which mpicc)            \
  --with-fc=$(which mpif90)           \
  --with-cxx=$(which mpic++)          \
  --with-fortran                      \
  --with-shared-libraries             \
  --with-hdf5                         \
  --with-hdf5-dir="$HDF5_DIR"         \
  --with-cmake=$(which cmake)

I run the model with:
mpirun -np 2 -map-by socket bin/petsc_solver -pc_type gamg -pc_gamg_agg_nsmooths 1 -pc_gamg_verbose 1 -info

Should I also append the output from creating the DMDA?
For the KSP setup it says:

[0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter
[0] VecScatterCreate(): General case: MPI to Seq
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 6359040 X 131520; storage space: 0 unneeded,420864 used
[1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 8
[1] MatCheckCompressedRow(): Found the ratio (num_zerorows 6227520)/(num_localrows 6359040) > 0.6. Use CompressedRow routines.
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 6359040 X 131520; storage space: 0 unneeded,420864 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 8
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 6227520)/(num_localrows 6359040) > 0.6. Use CompressedRow routines.
[1] PetscCommDuplicate(): Using internal PETSc communicator 47323925895584 15967792
Setup KSP
[0] PetscCommDuplicate(): Using internal PETSc communicator 47292596238752 42057616
[0] PCSetUp(): Setting up PC for first time
[0]PCSetUp_GAMG level 0 N=12718080, n data rows=10, n data cols=10, nnz/row (ave)=10, np=2
[1] PetscCommDuplicate(): Using internal PETSc communicator 47323925894432 16179040
[0] PetscCommDuplicate(): Using internal PETSc communicator 47292596237600 41749680
[1] PetscCommDuplicate(): Using internal PETSc communicator 47323925894432 16179040
[0] PetscCommDuplicate(): Using internal PETSc communicator 47292596237600 41749680
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: New nonzero at (4704,4609) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.5-25-g0ace994 GIT Date: 2014-07-07 14:50:15 +0200
[1]PETSC ERROR: bin/petsc_solver on a debug named lx001 by jakub Tue Jul  8 17:47:31 2014
[1]PETSC ERROR: Configure options --with-cc=/home/opt/cosmo_tica_lib/ompi1.8.1/openmpi-1.8.1/install/bin/mpicc --with-fc=/home/opt/cosmo_tica_lib/ompi1.8.1/openmpi-1.8.1/install/bin/mpif90 --with-cxx=/home/opt/cosmo_tica_lib/ompi1.8.1/openmpi-1.8.1/install/bin/mpic++ --with-fortran --with-shared-libraries --with-hdf5 --with-hdf5-dir=/home/opt/cosmo_tica_lib//ompi1.8.1/hdf5/HDF_Group/HDF5/1.8.13/ --with-cmake=/usr/bin/cmake
[1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 530 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #2 MatSetValues() line 1136 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/mat/interface/matrix.c
[1]PETSC ERROR: #3 PCGAMGCreateGraph() line 72 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/impls/gamg/tools.c
[1]PETSC ERROR: #4 PCGAMGgraph_AGG() line 936 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/impls/gamg/agg.c
[1]PETSC ERROR: #5 PCSetUp_GAMG() line 595 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/impls/gamg/gamg.c
[1]PETSC ERROR: #6 PCSetUp() line 902 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: #7 KSPSetUp() line 305 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: New nonzero at (4704,4609) caused a malloc
Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.5-25-g0ace994 GIT Date: 2014-07-07 14:50:15 +0200
[0]PETSC ERROR: bin/petsc_solver on a debug named lx001 by jakub Tue Jul  8 17:47:31 2014
[0]PETSC ERROR: Configure options --with-cc=/home/opt/cosmo_tica_lib/ompi1.8.1/openmpi-1.8.1/install/bin/mpicc --with-fc=/home/opt/cosmo_tica_lib/ompi1.8.1/openmpi-1.8.1/install/bin/mpif90 --with-cxx=/home/opt/cosmo_tica_lib/ompi1.8.1/openmpi-1.8.1/install/bin/mpic++ --with-fortran --with-shared-libraries --with-hdf5 --with-hdf5-dir=/home/opt/cosmo_tica_lib//ompi1.8.1/hdf5/HDF_Group/HDF5/1.8.13/ --with-cmake=/usr/bin/cmake
[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 530 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #2 MatSetValues() line 1136 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 PCGAMGCreateGraph() line 72 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/impls/gamg/tools.c
[0]PETSC ERROR: #4 PCGAMGgraph_AGG() line 936 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/impls/gamg/agg.c
[0]PETSC ERROR: #5 PCSetUp_GAMG() line 595 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/impls/gamg/gamg.c
[0]PETSC ERROR: #6 PCSetUp() line 902 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #7 KSPSetUp() line 305 in /home/opt/cosmo_tica_lib/ompi1.8.1/petsc-master/src/ksp/ksp/interface/itfunc.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 63.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
-------------------------------------------------------------------------

On 08.07.2014 17:14, Barry Smith wrote:
    GAMG should not give a preallocation error regardless of the preallocation 
of the original matrix; so presumably you’ve hit on an error in GAMG. What 
version of PETSc are you using and what is the ENTIRE error message from GAMG?  
If you are using PETSc 3.4 please upgrade to 3.5 and see if GAMG still produces 
an error.

    Barry



On Jul 8, 2014, at 7:39 AM, Fabian.Jakub <[email protected]> wrote:

Hi,
I have a question regarding GAMG, where I get a wrong preallocation, i.e. a MAT_NEW_NONZERO_ALLOCATION_ERR.

I use a 3D DMDA with 10 dof, but the coupling only ever needs 2 dof on a star stencil.

The code to set up the matrix looks like this:

! create the matrix from the DMDA, but only preallocated (no preset nonzero structure)
call DMSetMatrixPreallocateOnly(C%da, PETSC_TRUE, ierr)
call DMCreateMatrix(C%da, A, ierr)
call MatSetFromOptions(A, ierr)

! custom per-row preallocation: d_nnz/o_nnz hold the nonzero counts for the
! diagonal and off-diagonal block of each local row
call MatMPIAIJSetPreallocation(A, PETSC_NULL_INTEGER, d_nnz, PETSC_NULL_INTEGER, o_nnz, ierr)

! ... insert matrix values & assemble ...

! ... then solve.

If I solve the system with any "normal" KSP/PC there is no problem, and the matrix info actually confirms that the preallocation is good.
However, if I use GAMG, I get an allocation error when it tries to create a coarse grid.

If I use the DMDA preallocation, it works but uses way more memory...
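
For reference, the "DMDA preallocation" variant that does work with GAMG (but over-allocates for my 2-dof coupling) is simply to let DMCreateMatrix do everything, roughly like this (sketch only, not my exact code):

! sketch: take the full-stencil preallocation and preset nonzero structure from the DMDA,
! i.e. no DMSetMatrixPreallocateOnly and no manual MatMPIAIJSetPreallocation
call DMCreateMatrix(C%da, A, ierr)
call MatSetFromOptions(A, ierr)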

Is there a way to use custom preallocation and at the same time let GAMG create the coarse grid?
Is that even the problem, or am I missing something?
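
The only other knob I can think of is the one the error message itself suggests, i.e. turning off the new-nonzero check, roughly as below (sketch only; since the failure happens while GAMG builds its coarse grid, I am not sure setting this on my matrix A would even help, rather than just hide the symptom):

! sketch only: relax the check the error message mentions
call MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE, ierr)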

Thank you so very much.

Sincerely,
Fabian







