Something is still very wrong:

Norm: 7.2163210348656361E-011
Norm: 1.7279069282940779E-005 For 1 process run with the option -ksp_view_mat binary -ksp_view_rhs binary -ksp_view_solution binary and send the file binaryoutput to [email protected] then for 4 processes run the same way and send the new file binaryoutput to [email protected] Barry > On Oct 1, 2016, at 12:08 PM, Manuel Valera <[email protected]> wrote: > > Interesting, it looks like it is an output issue, ksp_true_residual goes down > to 10^-11 in every case. Output attached. > > On Sat, Oct 1, 2016 at 9:56 AM, Barry Smith <[email protected]> wrote: > > This is not expected. > > Run on 1 and 4 processes with -ksp_monitor_true_residual > -ksp_converged_reason -ksp_view and send the output. > > Barry > > > On Oct 1, 2016, at 11:11 AM, Manuel Valera <[email protected]> wrote: > > > > I'm comparing with the ones vector as in many examples from petsc docs, so > > this may be because i hadn't set up the output to a single processor, but i > > get the following output for 1,2,4 processors: > > > > n=1 > > TrivSoln loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > Norm: 7.21632103486563610E-011 > > Its: 101 > > Total time: 5.0112988948822021 > > > > n=2 > > TrivSoln loaded, size: 213120 / 213120 > > TrivSoln loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > Norm: 1.09862436488003634E-007 > > Its: 101 > > Norm: 1.09862436488003634E-007 > > Its: 101 > > Total time: 2.9765341281890869 > > Total time: 2.9770300388336182 > > > > n=4 > > TrivSoln loaded, size: 213120 / 213120 > > TrivSoln loaded, size: 213120 / 213120 > > TrivSoln loaded, size: 213120 / 213120 > > TrivSoln loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > RHS loaded, size: 213120 / 213120 > > Norm: 1.72790692829407788E-005 > > Its: 101 > > Norm: 1.72790692829407788E-005 > > Its: 101 > > Norm: 1.72790692829407788E-005 > > Its: 101 > > Norm: 1.72790692829407788E-005 > > Its: 101 > > Total time: 1.8007240295410156 > > Total time: 1.8008360862731934 > > Total time: 1.8008909225463867 > > Total time: 1.8009200096130371 > > > > > > That is the error norm from the ones vector, im attaching the script again. > > > > > > On Sat, Oct 1, 2016 at 8:59 AM, Barry Smith <[email protected]> wrote: > > > > > On Sep 30, 2016, at 9:13 PM, Manuel Valera <[email protected]> wrote: > > > > > > Hi Barry and all, > > > > > > I was successful on creating the parallel version to solve my big system, > > > it is scaling accordingly, but i noticed the error norm increasing too, i > > > don't know if this is because the output is duplicated or if its really > > > increasing. Is this expected ? > > > > What do you mean by error norm? Do you have an exact solution you are > > comparing to? If so, you should scale the norm arising from this by > > 1/sqrt(nx*ny) where nx and ny are the number of grid points in the x and y > > direction. This scaling makes the norm correspond to the L2 norm of the > > error which is what you want to measure. > > > > With this new scaling you can do convergence studies, for example > > refine the grid once how much does the error norm reduce, refine the grid > > again and you should see a similar reduction in the error norm. 
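A minimal sketch (not from the attached script) of the scaling described above: take the discrete 2-norm of the error vector and divide by sqrt(nx*ny) so the result approximates the L2 norm of the error. Here "err", "nx" and "ny" (error vector and grid dimensions) are assumed to exist already, with the usual PETSc Fortran includes and declarations in place.

      PetscErrorCode ierr
      PetscReal      errnorm
!     discrete 2-norm of the error vector
      call VecNorm(err,NORM_2,errnorm,ierr)
!     rescale so the value approximates the L2 norm of the error and can
!     be compared across grid refinements
      errnorm = errnorm/sqrt(dble(nx)*dble(ny))

With this scaling, each grid refinement should reduce errnorm by a factor set by the order of the discretization.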
> > > > > > Barry > > > > > > > > Thanks > > > > > > On Tue, Sep 27, 2016 at 4:07 PM, Barry Smith <[email protected]> wrote: > > > > > > Yes, always use the binary file > > > > > > > On Sep 27, 2016, at 3:13 PM, Manuel Valera <[email protected]> > > > > wrote: > > > > > > > > Barry, thanks for your insight, > > > > > > > > This standalone script must be translated into a much bigger model, > > > > which uses AIJ matrices to define the laplacian in the form of the 3 > > > > usual arrays, the ascii files in the script take the place of the > > > > arrays which are passed to the solving routine in the model. > > > > > > > > So, can i use the approach you mention to create the MPIAIJ from the > > > > petsc binary file ? would this be a better solution than reading the > > > > three arrays directly? In the model, even the smallest matrix is > > > > 10^5x10^5 elements > > > > > > > > Thanks. > > > > > > > > > > > > On Tue, Sep 27, 2016 at 12:53 PM, Barry Smith <[email protected]> > > > > wrote: > > > > > > > > Are you loading a matrix from an ASCII file? If so don't do that. You > > > > should write a simple sequential PETSc program that reads in the ASCII > > > > file and saves the matrix as a PETSc binary file with MatView(). Then > > > > write your parallel code that reads in the binary file with MatLoad() > > > > and solves the system. You can read in the right hand side from ASCII > > > > and save it in the binary file also. Trying to read an ASCII file in > > > > parallel and set it into a PETSc parallel matrix is just a totally > > > > thankless task that is unnecessary. > > > > > > > > Barry > > > > > > > > > On Sep 26, 2016, at 6:40 PM, Manuel Valera <[email protected]> > > > > > wrote: > > > > > > > > > > Ok, last output was from simulated multicores, in an actual cluster > > > > > the errors are of the kind: > > > > > > > > > > [valera@cinci CSRMatrix]$ petsc -n 2 ./solvelinearmgPETSc > > > > > TrivSoln loaded, size: 4 / 4 > > > > > TrivSoln loaded, size: 4 / 4 > > > > > RHS loaded, size: 4 / 4 > > > > > RHS loaded, size: 4 / 4 > > > > > [0]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [0]PETSC ERROR: Argument out of range > > > > > [0]PETSC ERROR: Comm must be of size 1 > > > > > [0]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. > > > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [0]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [1]PETSC ERROR: Argument out of range > > > > > [1]PETSC ERROR: Comm must be of size 1 > > > > > [1]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. 
> > > > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [1]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [1]PETSC ERROR: #2 MatSetType() line 94 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c > > > > > [1]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > local size: 2 > > > > > local size: 2 > > > > > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran > > > > > --download-fblaslapack=1 --download-mpich > > > > > [0]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [0]PETSC ERROR: #2 MatSetType() line 94 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c > > > > > [0]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [0]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [1]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [1]PETSC ERROR: [0]PETSC ERROR: Nonconforming object sizes > > > > > [0]PETSC ERROR: Sum of local lengths 8 does not equal global length > > > > > 4, my local length 4 > > > > > likely a call to VecSetSizes() or MatSetSizes() is wrong. > > > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#split > > > > > [0]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. > > > > > Nonconforming object sizes > > > > > [1]PETSC ERROR: Sum of local lengths 8 does not equal global length > > > > > 4, my local length 4 > > > > > likely a call to VecSetSizes() or MatSetSizes() is wrong. > > > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#split > > > > > [1]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. 
> > > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [0]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [0]PETSC ERROR: #4 PetscSplitOwnership() line 93 in > > > > > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [1]PETSC ERROR: #4 PetscSplitOwnership() line 93 in > > > > > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c > > > > > [0]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in > > > > > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c > > > > > [0]PETSC ERROR: #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [1]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in > > > > > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c > > > > > [1]PETSC ERROR: [0]PETSC ERROR: #7 MatMPIAIJSetPreallocation() line > > > > > 3505 in /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [1]PETSC ERROR: [0]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > #7 MatMPIAIJSetPreallocation() line 3505 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [1]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [0]PETSC ERROR: #9 MatSetUp() line 739 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [1]PETSC ERROR: #9 MatSetUp() line 739 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [0]PETSC ERROR: Object is in wrong state > > > > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on > > > > > argument 1 "mat" before MatSetNearNullSpace() > > > > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [1]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. 
> > > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [0]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > Object is in wrong state > > > > > [1]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on > > > > > argument 1 "mat" before MatSetNearNullSpace() > > > > > [1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cc=gcc > > > > > --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 > > > > > --download-mpich > > > > > [0]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. > > > > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [1]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [0]PETSC ERROR: Object is in wrong state > > > > > [1]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on > > > > > argument 1 "mat" before MatAssemblyBegin() > > > > > [0]PETSC ERROR: [1]PETSC ERROR: Object is in wrong state > > > > > [1]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. > > > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on > > > > > argument 1 "mat" before MatAssemblyBegin() > > > > > [1]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. 
> > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran > > > > > --download-fblaslapack=1 --download-mpich > > > > > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: > > > > > ------------------------------------------------------------------------ > > > > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > > > > > probably memory access out of range > > > > > [1]PETSC ERROR: > > > > > ------------------------------------------------------------------------ > > > > > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > > > > > probably memory access out of range > > > > > [1]PETSC ERROR: [0]PETSC ERROR: Try option -start_in_debugger or > > > > > -on_error_attach_debugger > > > > > [0]PETSC ERROR: or see > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > > > > [0]PETSC ERROR: Try option -start_in_debugger or > > > > > -on_error_attach_debugger > > > > > [1]PETSC ERROR: or see > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > > > > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac > > > > > OS X to find memory corruption errors > > > > > or try http://valgrind.org on GNU/linux and Apple Mac OS X to find > > > > > memory corruption errors > > > > > [0]PETSC ERROR: likely location of problem given in stack below > > > > > [0]PETSC ERROR: --------------------- Stack Frames > > > > > ------------------------------------ > > > > > [1]PETSC ERROR: likely location of problem given in stack below > > > > > [1]PETSC ERROR: --------------------- Stack Frames > > > > > ------------------------------------ > > > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not > > > > > available, > > > > > [0]PETSC ERROR: INSTEAD the line number of the start of the > > > > > function > > > > > [0]PETSC ERROR: [1]PETSC ERROR: Note: The EXACT line numbers in the > > > > > stack are not available, > > > > > [1]PETSC ERROR: INSTEAD the line number of the start of the > > > > > function > > > > > is given. > > > > > [0]PETSC ERROR: [0] MatAssemblyEnd line 5185 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: [1]PETSC ERROR: is given. 
> > > > > [1]PETSC ERROR: [1] MatAssemblyEnd line 5185 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0] MatAssemblyBegin line 5090 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: [0] MatSetNearNullSpace line 8191 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: [1]PETSC ERROR: [1] MatAssemblyBegin line 5090 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [1]PETSC ERROR: [0] PetscSplitOwnership line 80 > > > > > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c > > > > > [0]PETSC ERROR: [0] PetscLayoutSetUp line 129 > > > > > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c > > > > > [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation_MPIAIJ line 2767 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [1] MatSetNearNullSpace line 8191 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [1]PETSC ERROR: [1] PetscSplitOwnership line 80 > > > > > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c > > > > > [1]PETSC ERROR: [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation line > > > > > 3502 /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [0]PETSC ERROR: [0] MatSetUp_MPIAIJ line 2152 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [1] PetscLayoutSetUp line 129 > > > > > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c > > > > > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation_MPIAIJ line 2767 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [0]PETSC ERROR: [0] MatSetUp line 727 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [0]PETSC ERROR: [0] MatCreate_SeqAIJ line 3956 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation line 3502 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [1]PETSC ERROR: [1] MatSetUp_MPIAIJ line 2152 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c > > > > > [0]PETSC ERROR: [0] MatSetType line 44 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c > > > > > [0]PETSC ERROR: [0] MatCreateSeqAIJWithArrays line 4295 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [1]PETSC ERROR: [1] MatSetUp line 727 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c > > > > > [1]PETSC ERROR: [1] MatCreate_SeqAIJ line 3956 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [0]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [0]PETSC ERROR: Signal received > > > > > [1]PETSC ERROR: [1] MatSetType line 44 > > > > > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c > > > > > [1]PETSC ERROR: [1] MatCreateSeqAIJWithArrays line 4295 > > > > > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c > > > > > [0]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. 
> > > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message > > > > > -------------------------------------------------------------- > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [0]PETSC ERROR: Signal received > > > > > [1]PETSC ERROR: See > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > shooting. > > > > > [1]PETSC ERROR: #12 User provided function() line 0 in unknown file > > > > > Petsc Release Version 3.7.2, Jun, 05, 2016 > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > P on a > > > > > arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016 > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich > > > > > [1]PETSC ERROR: #12 User provided function() line 0 in unknown file > > > > > application called MPI_Abort(comm=0x84000004, 59) - process 0 > > > > > [cli_0]: aborting job: > > > > > application called MPI_Abort(comm=0x84000004, 59) - process 0 > > > > > application called MPI_Abort(comm=0x84000002, 59) - process 1 > > > > > [cli_1]: aborting job: > > > > > application called MPI_Abort(comm=0x84000002, 59) - process 1 > > > > > > > > > > =================================================================================== > > > > > = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES > > > > > = PID 10266 RUNNING AT cinci > > > > > = EXIT CODE: 59 > > > > > = CLEANING UP REMAINING PROCESSES > > > > > = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES > > > > > =================================================================================== > > > > > > > > > > > > > > > On Mon, Sep 26, 2016 at 3:51 PM, Manuel Valera > > > > > <[email protected]> wrote: > > > > > Ok, i created a tiny testcase just for this, > > > > > > > > > > The output from n# calls are as follows: > > > > > > > > > > n1: > > > > > Mat Object: 1 MPI processes > > > > > type: mpiaij > > > > > row 0: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > row 1: (0, 2.) (1, 1.) (2, 3.) (3, 4.) > > > > > row 2: (0, 4.) (1, 3.) (2, 1.) (3, 2.) > > > > > row 3: (0, 3.) (1, 4.) (2, 2.) (3, 1.) > > > > > > > > > > n2: > > > > > Mat Object: 2 MPI processes > > > > > type: mpiaij > > > > > row 0: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > row 1: (0, 2.) (1, 1.) (2, 3.) (3, 4.) > > > > > row 2: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > row 3: (0, 2.) (1, 1.) (2, 3.) (3, 4.) > > > > > > > > > > n4: > > > > > Mat Object: 4 MPI processes > > > > > type: mpiaij > > > > > row 0: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > row 1: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > row 2: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > row 3: (0, 1.) (1, 2.) (2, 4.) (3, 3.) > > > > > > > > > > > > > > > > > > > > It really gets messed, no idea what's happening. 
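The n2 and n4 output above is consistent with every rank passing index and value arrays that describe the same rows: MatCreateMPIAIJWithArrays() expects each rank to supply the CSR of only its own local rows, with a zero-based row-pointer array that restarts at 0 on every rank and zero-based global column indices. A minimal sketch (not the attached test case) for the 4x4 matrix above on 2 ranks, assuming "rank" comes from MPI_Comm_rank and the usual PETSc Fortran declarations are in place:

      PetscInt       m, nbdp, ia(3), ja(8)
      PetscScalar    va(8)
      Mat            Ap
      PetscErrorCode ierr
      m    = 2                       ! rows owned by this rank
      nbdp = 4                       ! global size
      ia   = (/0, 4, 8/)             ! local row pointers, restart at 0
      ja   = (/0,1,2,3, 0,1,2,3/)    ! global column indices, 0-based
      if (rank == 0) then
         va = (/1.d0,2.d0,4.d0,3.d0, 2.d0,1.d0,3.d0,4.d0/)   ! rows 0-1
      else
         va = (/4.d0,3.d0,1.d0,2.d0, 3.d0,4.d0,2.d0,1.d0/)   ! rows 2-3
      end if
      call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,m,PETSC_DECIDE, &
           nbdp,nbdp,ia,ja,va,Ap,ierr)

With each rank contributing only its own rows like this, MatView() should reproduce the n1 output for any number of processes.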
> > > > > > > > > > > > > > > > > > > > > > > > > On Mon, Sep 26, 2016 at 3:12 PM, Barry Smith <[email protected]> > > > > > wrote: > > > > > > > > > > > On Sep 26, 2016, at 5:07 PM, Manuel Valera <[email protected]> > > > > > > wrote: > > > > > > > > > > > > Ok i was using a big matrix before, from a smaller testcase i got > > > > > > the output and effectively, it looks like is not well read at all, > > > > > > results are attached for DRAW viewer, output is too big to use > > > > > > STDOUT even in the small testcase. n# is the number of processors > > > > > > requested. > > > > > > > > > > You need to construct a very small test case so you can determine > > > > > why the values do not end up where you expect them. There is no way > > > > > around it. > > > > > > > > > > > > is there a way to create the matrix in one node and the distribute > > > > > > it as needed on the rest ? maybe that would work. > > > > > > > > > > No the is not scalable. You become limited by the memory of the > > > > > one node. > > > > > > > > > > > > > > > > > Thanks > > > > > > > > > > > > On Mon, Sep 26, 2016 at 2:40 PM, Barry Smith <[email protected]> > > > > > > wrote: > > > > > > > > > > > > How large is the matrix? It will take a very long time if the > > > > > > matrix is large. Debug with a very small matrix. > > > > > > > > > > > > Barry > > > > > > > > > > > > > On Sep 26, 2016, at 4:34 PM, Manuel Valera > > > > > > > <[email protected]> wrote: > > > > > > > > > > > > > > Indeed there is something wrong with that call, it hangs out > > > > > > > indefinitely showing only: > > > > > > > > > > > > > > Mat Object: 1 MPI processes > > > > > > > type: mpiaij > > > > > > > > > > > > > > It draws my attention that this program works for 1 processor but > > > > > > > not more, but it doesnt show anything for that viewer in either > > > > > > > case. > > > > > > > > > > > > > > Thanks for the insight on the redundant calls, this is not very > > > > > > > clear on documentation, which calls are included in others. > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <[email protected]> > > > > > > > wrote: > > > > > > > > > > > > > > The call to MatCreateMPIAIJWithArrays() is likely interpreting > > > > > > > the values you pass in different than you expect. > > > > > > > > > > > > > > Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > > after the MatCreateMPIAIJWithArray() to see what PETSc thinks the > > > > > > > matrix is. > > > > > > > > > > > > > > > > > > > > > > On Sep 26, 2016, at 3:42 PM, Manuel Valera > > > > > > > > <[email protected]> wrote: > > > > > > > > > > > > > > > > Hello, > > > > > > > > > > > > > > > > I'm working on solve a linear system in parallel, following > > > > > > > > ex12 of the ksp tutorial i don't see major complication on > > > > > > > > doing so, so for a working linear system solver with PCJACOBI > > > > > > > > and KSPGCR i did only the following changes: > > > > > > > > > > > > > > > > call MatCreate(PETSC_COMM_WORLD,Ap,ierr) > > > > > > > > ! call MatSetType(Ap,MATSEQAIJ,ierr) > > > > > > > > call MatSetType(Ap,MATMPIAIJ,ierr) !paralellization > > > > > > > > > > > > > > > > call MatSetSizes(Ap,PETSC_DECIDE,PETSC_DECIDE,nbdp,nbdp,ierr); > > > > > > > > > > > > > > > > ! 
call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr) > > > > > > > > call MatSetFromOptions(Ap,ierr) > > > > > > > > > > > > > > Note that none of the lines above are needed (or do anything) > > > > > > > because the MatCreateMPIAIJWithArrays() creates the matrix from > > > > > > > scratch itself. > > > > > > > > > > > > > > Barry > > > > > > > > > > > > > > > ! call > > > > > > > > MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,iapi,japi,app,Ap,ierr) > > > > > > > > call > > > > > > > > MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr) > > > > > > > > > > > > > > > > > > > > > > > > I grayed out the changes from sequential implementation. > > > > > > > > > > > > > > > > So, it does not complain at runtime until it reaches > > > > > > > > KSPSolve(), with the following error: > > > > > > > > > > > > > > > > > > > > > > > > [1]PETSC ERROR: --------------------- Error Message > > > > > > > > -------------------------------------------------------------- > > > > > > > > [1]PETSC ERROR: Object is in wrong state > > > > > > > > [1]PETSC ERROR: Matrix is missing diagonal entry 0 > > > > > > > > [1]PETSC ERROR: See > > > > > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble > > > > > > > > shooting. > > > > > > > > [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown > > > > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > � � on a arch-linux2-c-debug named > > > > > > > > valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016 > > > > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > > > > > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1 > > > > > > > > --download-ml=1 > > > > > > > > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/mat/impls/aij/seq/aijfact.c > > > > > > > > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/mat/interface/matrix.c > > > > > > > > [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c > > > > > > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c > > > > > > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c > > > > > > > > [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_Singleblock() line > > > > > > > > 650 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c > > > > > > > > [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c > > > > > > > > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c > > > > > > > > [1]PETSC ERROR: #9 KSPSolve() line 600 in > > > > > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c > > > > > > > > At line 333 of file solvelinearmgPETSc.f90 > > > > > > > > Fortran runtime error: Array bound mismatch for dimension 1 of > > > > > > > > array 'sol' (213120/106560) > > > > > > > > > > > > > > > > > > > > > > > > This code works for -n 1 cores, but it gives this error when > > > > > > > > using more than one core. > > > > > > > > > > > > > > > > What am i missing? 
> > > > > > > > > > > > > > > > Regards, > > > > > > > > > > > > > > > > Manuel. > > > > > > > > > > > > > > > > <solvelinearmgPETSc.f90> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > <n4.png><n2.png><n1.png> > > > > > > > > > > > > > > > > > > > > <rhss.txt><solvelinearmgPETSc.f90><as.txt><ias.txt><jas.txt> > > > > > > > > > > > > > > > > > > > > <solvelinearmgPETSc.f90> > > > <kspmonitor4.txt><kspmonitor2.txt><kspmonitor1.txt>
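For the ASCII-to-binary route Barry recommends earlier in the thread, a minimal sketch, with the file name "system.petsc", the RHS vector "bp" and the viewer variable made up for illustration: a sequential helper writes the assembled matrix and right-hand side to one PETSc binary file, and the parallel solver reads them back with MatLoad()/VecLoad(), which distribute the data across the processes automatically.

      Mat            Ap
      Vec            bp
      PetscViewer    viewer
      PetscErrorCode ierr

!     sequential helper, run on 1 process: Ap and bp are assumed to be
!     already assembled from the ASCII data
      call PetscViewerBinaryOpen(PETSC_COMM_SELF,'system.petsc', &
           FILE_MODE_WRITE,viewer,ierr)
      call MatView(Ap,viewer,ierr)
      call VecView(bp,viewer,ierr)
      call PetscViewerDestroy(viewer,ierr)

!     parallel solver, any number of processes
      call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
      call MatSetType(Ap,MATMPIAIJ,ierr)
      call PetscViewerBinaryOpen(PETSC_COMM_WORLD,'system.petsc', &
           FILE_MODE_READ,viewer,ierr)
      call MatLoad(Ap,viewer,ierr)
      call VecCreate(PETSC_COMM_WORLD,bp,ierr)
      call VecLoad(bp,viewer,ierr)
      call PetscViewerDestroy(viewer,ierr)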
