I'm comparing against the ones vector, as in many examples from the PETSc
docs. This may be because I haven't set up the output to print from a single
processor, but I get the following output for 1, 2, and 4 processors:

n=1
TrivSoln loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 Norm:  7.21632103486563610E-011
 Its:         101
 Total time:   5.0112988948822021

n=2
 TrivSoln loaded, size:       213120 /      213120
 TrivSoln loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 Norm:  1.09862436488003634E-007
 Its:         101
 Norm:  1.09862436488003634E-007
 Its:         101
 Total time:   2.9765341281890869
 Total time:   2.9770300388336182

n=4
 TrivSoln loaded, size:       213120 /      213120
 TrivSoln loaded, size:       213120 /      213120
 TrivSoln loaded, size:       213120 /      213120
 TrivSoln loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 RHS loaded, size:       213120 /      213120
 Norm:  1.72790692829407788E-005
 Its:         101
 Norm:  1.72790692829407788E-005
 Its:         101
 Norm:  1.72790692829407788E-005
 Its:         101
 Norm:  1.72790692829407788E-005
 Its:         101
 Total time:   1.8007240295410156
 Total time:   1.8008360862731934
 Total time:   1.8008909225463867
 Total time:   1.8009200096130371


That is the error norm against the ones vector; I'm attaching the script again.
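
The duplicated lines come from plain Fortran print statements, which execute on
every rank. A minimal sketch of printing only once (assuming the norm, its, and
ierr variables of the attached script, plus a hypothetical msg buffer) would use
PetscPrintf() on PETSC_COMM_WORLD, which prints from the first rank only:

  character(len=128) :: msg   ! hypothetical buffer, not in the attached script
  write(msg,'(A,ES23.15E3)') ' Norm: ', norm
  call PetscPrintf(PETSC_COMM_WORLD, trim(msg)//achar(10), ierr)
  write(msg,'(A,I8)') ' Its: ', its
  call PetscPrintf(PETSC_COMM_WORLD, trim(msg)//achar(10), ierr)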


On Sat, Oct 1, 2016 at 8:59 AM, Barry Smith <[email protected]> wrote:

>
> > On Sep 30, 2016, at 9:13 PM, Manuel Valera <[email protected]>
> wrote:
> >
> > Hi Barry and all,
> >
> > I was successful in creating the parallel version to solve my big
> > system, and it is scaling accordingly, but I noticed the error norm
> > increasing too. I don't know if this is because the output is duplicated
> > or if it is really increasing. Is this expected?
>
>    What do you mean by error norm? Do you have an exact solution you are
> comparing to? If so, you should scale the norm arising from this by
> 1/sqrt(nx*ny) where nx and ny are the number of grid points in the x and y
> direction. This scaling makes the norm correspond to the L2 norm of the
> error which is what you want to measure.
>
>     With this new scaling you can do convergence studies, for example
> refine the grid once how much does the error norm reduce, refine the grid
> again and you should see a similar reduction in the error norm.
>
>
>   Barry
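
A minimal sketch of that scaling in the attached script's terms, assuming nx
and ny (and nz for the 3D Seamount grid) are added as grid-dimension variables;
they are not in the current script:

  PetscReal :: l2norm                            ! hypothetical, for illustration
  l2norm = norm/sqrt(real(nx*ny, kind(norm)))    ! 2D case suggested above
  ! for the 3D 97x61x38 grid this would be norm/sqrt(real(nx*ny*nz, kind(norm)))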
>
> >
> > Thanks
> >
> > On Tue, Sep 27, 2016 at 4:07 PM, Barry Smith <[email protected]> wrote:
> >
> >    Yes, always use the binary file
> >
> > > On Sep 27, 2016, at 3:13 PM, Manuel Valera <[email protected]>
> wrote:
> > >
> > > Barry, thanks for your insight,
> > >
> > > This standalone script must be translated into a much bigger model,
> > > which uses AIJ matrices to define the Laplacian in the form of the three
> > > usual arrays; the ASCII files in the script take the place of the arrays
> > > that are passed to the solving routine in the model.
> > >
> > > So, can I use the approach you mention to create the MPIAIJ from the
> > > PETSc binary file? Would this be a better solution than reading the three
> > > arrays directly? In the model, even the smallest matrix is 10^5 x 10^5
> > > elements.
> > >
> > > Thanks.
> > >
> > >
> > > On Tue, Sep 27, 2016 at 12:53 PM, Barry Smith <[email protected]>
> wrote:
> > >
> > >   Are you loading a matrix from an ASCII file? If so don't do that.
> You should write a simple sequential PETSc program that reads in the ASCII
> file and saves the matrix as a PETSc binary file with MatView(). Then write
> your parallel code that reads in the binary file with MatLoad() and solves
> the system. You can read in the right hand side from ASCII and save it in
> the binary file also. Trying to read an ASCII file in parallel and set it
> into a PETSc parallel matrix is just a totally thankless task that is
> unnecessary.
> > >
> > >    Barry
> > >
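
A minimal sketch of such a sequential converter, in the attached script's terms
(assumptions: the iac.txt/jac.txt/ac.txt file names, sizes, and the AP.petsc
output name follow the attached script; the program name ascii2binary is made
up; and MatCreateSeqAIJWithArrays() expects 0-based row pointers and column
indices, so 1-based ASCII data is shifted first):

PROGRAM ascii2binary
IMPLICIT NONE
#include <petsc/finclude/petscsys.h>
#include <petsc/finclude/petscmat.h>
#include <petsc/finclude/petscviewer.h>
  PetscInt, parameter      :: nbnodos = 213120, nnz = 3934732
  PetscInt, allocatable    :: ia(:), ja(:)
  PetscScalar, allocatable :: a(:)
  Mat                      :: Aseq
  PetscViewer              :: bview
  PetscErrorCode           :: ierr

  call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
  allocate(ia(nbnodos+1), ja(nnz), a(nnz))
  open(15,file='iac.txt',status='old'); read(15,*) ia; close(15)
  open(16,file='jac.txt',status='old'); read(16,*) ja; close(16)
  open(17,file='ac.txt', status='old'); read(17,*) a ; close(17)
  ia = ia - 1 ; ja = ja - 1      ! shift to 0-based indexing (assuming 1-based files)
  call MatCreateSeqAIJWithArrays(PETSC_COMM_SELF,nbnodos,nbnodos,ia,ja,a,Aseq,ierr)
  call PetscViewerBinaryOpen(PETSC_COMM_SELF,'AP.petsc',FILE_MODE_WRITE,bview,ierr)
  call MatView(Aseq,bview,ierr)  ! the parallel code then reads this with MatLoad()
  ! (the RHS could be appended here with VecView() and read back with VecLoad())
  call PetscViewerDestroy(bview,ierr)
  call MatDestroy(Aseq,ierr)
  call PetscFinalize(ierr)
END PROGRAM ascii2binary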
> > > > On Sep 26, 2016, at 6:40 PM, Manuel Valera <[email protected]>
> wrote:
> > > >
> > > > OK, the last output was from simulated multicores; on an actual
> > > > cluster the errors are of the kind:
> > > >
> > > > [valera@cinci CSRMatrix]$ petsc -n 2 ./solvelinearmgPETSc
> > > >  TrivSoln loaded, size:            4 /           4
> > > >  TrivSoln loaded, size:            4 /           4
> > > >  RHS loaded, size:            4 /           4
> > > >  RHS loaded, size:            4 /           4
> > > > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [0]PETSC ERROR: Argument out of range
> > > > [0]PETSC ERROR: Comm must be of size 1
> > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [1]PETSC ERROR: Argument out of range
> > > > [1]PETSC ERROR: Comm must be of size 1
> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [1]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [1]PETSC ERROR: #2 MatSetType() line 94 in
> /home/valera/petsc-3.7.2/src/mat/interface/matreg.c
> > > > [1]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > >  local size:           2
> > > >  local size:           2
> > > > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
> --download-fblaslapack=1 --download-mpich
> > > > [0]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [0]PETSC ERROR: #2 MatSetType() line 94 in
> /home/valera/petsc-3.7.2/src/mat/interface/matreg.c
> > > > [0]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [1]PETSC ERROR: [0]PETSC ERROR: Nonconforming object sizes
> > > > [0]PETSC ERROR: Sum of local lengths 8 does not equal global length
> 4, my local length 4
> > > >   likely a call to VecSetSizes() or MatSetSizes() is wrong.
> > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
> > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > Nonconforming object sizes
> > > > [1]PETSC ERROR: Sum of local lengths 8 does not equal global length
> 4, my local length 4
> > > >   likely a call to VecSetSizes() or MatSetSizes() is wrong.
> > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [0]PETSC ERROR: #4 PetscSplitOwnership() line 93 in
> /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [1]PETSC ERROR: #4 PetscSplitOwnership() line 93 in
> /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > > > [0]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in
> /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > > > [0]PETSC ERROR: #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [1]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in
> /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > > > [1]PETSC ERROR: [0]PETSC ERROR: #7 MatMPIAIJSetPreallocation() line
> 3505 in /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [1]PETSC ERROR: [0]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > #7 MatMPIAIJSetPreallocation() line 3505 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [1]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [0]PETSC ERROR: #9 MatSetUp() line 739 in
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [1]PETSC ERROR: #9 MatSetUp() line 739 in
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [0]PETSC ERROR: Object is in wrong state
> > > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
> argument 1 "mat" before MatSetNearNullSpace()
> > > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > Object is in wrong state
> > > > [1]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
> argument 1 "mat" before MatSetNearNullSpace()
> > > > [1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cc=gcc
> --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [0]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> shooting.
> > > > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [1]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [0]PETSC ERROR: Object is in wrong state
> > > > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
> argument 1 "mat" before MatAssemblyBegin()
> > > > [0]PETSC ERROR: [1]PETSC ERROR: Object is in wrong state
> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on
> argument 1 "mat" before MatAssemblyBegin()
> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
> --download-fblaslapack=1 --download-mpich
> > > > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [0]PETSC ERROR: ------------------------------
> ------------------------------------------
> > > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
> Violation, probably memory access out of range
> > > > [1]PETSC ERROR: ------------------------------
> ------------------------------------------
> > > > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
> Violation, probably memory access out of range
> > > > [1]PETSC ERROR: [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> > > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/
> documentation/faq.html#valgrind
> > > > [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> > > > [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/
> documentation/faq.html#valgrind
> > > > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple
> Mac OS X to find memory corruption errors
> > > > or try http://valgrind.org on GNU/linux and Apple Mac OS X to find
> memory corruption errors
> > > > [0]PETSC ERROR: likely location of problem given in stack below
> > > > [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> > > > [1]PETSC ERROR: likely location of problem given in stack below
> > > > [1]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> > > > [0]PETSC ERROR:       INSTEAD the line number of the start of the
> function
> > > > [0]PETSC ERROR: [1]PETSC ERROR: Note: The EXACT line numbers in the
> stack are not available,
> > > > [1]PETSC ERROR:       INSTEAD the line number of the start of the
> function
> > > >       is given.
> > > > [0]PETSC ERROR: [0] MatAssemblyEnd line 5185
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [0]PETSC ERROR: [1]PETSC ERROR:       is given.
> > > > [1]PETSC ERROR: [1] MatAssemblyEnd line 5185
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [0] MatAssemblyBegin line 5090 /home/valera/petsc-3.7.2/src/
> mat/interface/matrix.c
> > > > [0]PETSC ERROR: [0] MatSetNearNullSpace line 8191
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [0]PETSC ERROR: [1]PETSC ERROR: [1] MatAssemblyBegin line 5090
> /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > > > [1]PETSC ERROR: [0] PetscSplitOwnership line 80
> /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > > > [0]PETSC ERROR: [0] PetscLayoutSetUp line 129
> /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > > > [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation_MPIAIJ line 2767
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [1] MatSetNearNullSpace line 8191 /home/valera/petsc-3.7.2/src/
> mat/interface/matrix.c
> > > > [1]PETSC ERROR: [1] PetscSplitOwnership line 80
> /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > > > [1]PETSC ERROR: [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation line
> 3502 /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [0]PETSC ERROR: [0] MatSetUp_MPIAIJ line 2152
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [1] PetscLayoutSetUp line 129 /home/valera/petsc-3.7.2/src/
> vec/is/utils/pmap.c
> > > > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation_MPIAIJ line 2767
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [0]PETSC ERROR: [0] MatSetUp line 727 /home/valera/petsc-3.7.2/src/
> mat/interface/matrix.c
> > > > [0]PETSC ERROR: [0] MatCreate_SeqAIJ line 3956
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation line 3502
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [1]PETSC ERROR: [1] MatSetUp_MPIAIJ line 2152
> /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > > > [0]PETSC ERROR: [0] MatSetType line 44 /home/valera/petsc-3.7.2/src/
> mat/interface/matreg.c
> > > > [0]PETSC ERROR: [0] MatCreateSeqAIJWithArrays line 4295
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [1]PETSC ERROR: [1] MatSetUp line 727 /home/valera/petsc-3.7.2/src/
> mat/interface/matrix.c
> > > > [1]PETSC ERROR: [1] MatCreate_SeqAIJ line 3956
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [0]PETSC ERROR: Signal received
> > > > [1]PETSC ERROR: [1] MatSetType line 44 /home/valera/petsc-3.7.2/src/
> mat/interface/matreg.c
> > > > [1]PETSC ERROR: [1] MatCreateSeqAIJWithArrays line 4295
> /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [0]PETSC ERROR: Signal received
> > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > [1]PETSC ERROR: #12 User provided function() line 0 in  unknown file
> > > > Petsc Release Version 3.7.2, Jun, 05, 2016
> > > > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > > > [1]PETSC ERROR: #12 User provided function() line 0 in  unknown file
> > > > application called MPI_Abort(comm=0x84000004, 59) - process 0
> > > > [cli_0]: aborting job:
> > > > application called MPI_Abort(comm=0x84000004, 59) - process 0
> > > > application called MPI_Abort(comm=0x84000002, 59) - process 1
> > > > [cli_1]: aborting job:
> > > > application called MPI_Abort(comm=0x84000002, 59) - process 1
> > > >
> > > > ============================================================
> =======================
> > > > =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > > > =   PID 10266 RUNNING AT cinci
> > > > =   EXIT CODE: 59
> > > > =   CLEANING UP REMAINING PROCESSES
> > > > =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > > > ============================================================
> =======================
> > > >
> > > >
> > > > On Mon, Sep 26, 2016 at 3:51 PM, Manuel Valera <
> [email protected]> wrote:
> > > > OK, I created a tiny testcase just for this,
> > > >
> > > > The output from the n# calls is as follows:
> > > >
> > > > n1:
> > > > Mat Object: 1 MPI processes
> > > >   type: mpiaij
> > > > row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > > row 1: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> > > > row 2: (0, 4.)  (1, 3.)  (2, 1.)  (3, 2.)
> > > > row 3: (0, 3.)  (1, 4.)  (2, 2.)  (3, 1.)
> > > >
> > > > n2:
> > > > Mat Object: 2 MPI processes
> > > >   type: mpiaij
> > > > row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > > row 1: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> > > > row 2: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > > row 3: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> > > >
> > > > n4:
> > > > Mat Object: 4 MPI processes
> > > >   type: mpiaij
> > > > row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > > row 1: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > > row 2: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > > row 3: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > > >
> > > >
> > > >
> > > > It really gets messed up; I have no idea what's happening.
> > > >
> > > >
> > > >
> > > >
> > > > On Mon, Sep 26, 2016 at 3:12 PM, Barry Smith <[email protected]>
> wrote:
> > > >
> > > > > On Sep 26, 2016, at 5:07 PM, Manuel Valera <[email protected]>
> wrote:
> > > > >
> > > > > OK, I was using a big matrix before. From a smaller testcase I got
> > > > > the output and, effectively, it looks like it is not read correctly at
> > > > > all; results are attached for the DRAW viewer, since the output is too
> > > > > big for STDOUT even in the small testcase. n# is the number of
> > > > > processors requested.
> > > >
> > > >    You need to construct a very small test case so you can determine
> why the values do not end up where you expect them. There is no way around
> it.
> > > > >
> > > > > Is there a way to create the matrix on one node and then distribute
> > > > > it as needed to the rest? Maybe that would work.
> > > >
> > > >    No, that is not scalable. You become limited by the memory of the
> > > >    one node.
> > > >
> > > > >
> > > > > Thanks
> > > > >
> > > > > On Mon, Sep 26, 2016 at 2:40 PM, Barry Smith <[email protected]>
> wrote:
> > > > >
> > > > >     How large is the matrix? It will take a very long time if the
> matrix is large. Debug with a very small matrix.
> > > > >
> > > > >   Barry
> > > > >
> > > > > > On Sep 26, 2016, at 4:34 PM, Manuel Valera <
> [email protected]> wrote:
> > > > > >
> > > > > > Indeed there is something wrong with that call; it hangs
> > > > > > indefinitely, showing only:
> > > > > >
> > > > > >  Mat Object: 1 MPI processes
> > > > > >   type: mpiaij
> > > > > >
> > > > > > It catches my attention that this program works for 1 processor
> > > > > > but not for more, though it doesn't show anything for that viewer
> > > > > > in either case.
> > > > > >
> > > > > > Thanks for the insight on the redundant calls; it is not very clear
> > > > > > in the documentation which calls are included in others.
> > > > > >
> > > > > >
> > > > > >
> > > > > > On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <[email protected]>
> wrote:
> > > > > >
> > > > > >    The call to MatCreateMPIAIJWithArrays() is likely interpreting
> > > > > >    the values you pass in differently than you expect.
> > > > > >
> > > > > >     Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr)
> after the MatCreateMPIAIJWithArray() to see what PETSc thinks the matrix is.
> > > > > >
> > > > > >
> > > > > > > On Sep 26, 2016, at 3:42 PM, Manuel Valera <
> [email protected]> wrote:
> > > > > > >
> > > > > > > Hello,
> > > > > > >
> > > > > > > I'm working on solving a linear system in parallel. Following
> > > > > > > ex12 of the KSP tutorial I don't see any major complication in
> > > > > > > doing so, so for a working linear system solver with PCJACOBI and
> > > > > > > KSPGCR I made only the following changes:
> > > > > > >
> > > > > > >    call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
> > > > > > > !  call MatSetType(Ap,MATSEQAIJ,ierr)
> > > > > > >   call MatSetType(Ap,MATMPIAIJ,ierr) !paralellization
> > > > > > >
> > > > > > >   call MatSetSizes(Ap,PETSC_DECIDE,
> PETSC_DECIDE,nbdp,nbdp,ierr);
> > > > > > >
> > > > > > > !  call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr)
> > > > > > >   call MatSetFromOptions(Ap,ierr)
> > > > > >
> > > > > >     Note that none of the lines above are needed (or do
> anything) because the MatCreateMPIAIJWithArrays() creates the matrix from
> scratch itself.
> > > > > >
> > > > > >    Barry
> > > > > >
> > > > > > > !  call MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,
> iapi,japi,app,Ap,ierr)
> > > > > > >  call MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(
> nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)
> > > > > > >
> > > > > > >
> > > > > > > I grayed out the changes from the sequential implementation.
> > > > > > >
> > > > > > > So, it does not complain at runtime until it reaches
> KSPSolve(), with the following error:
> > > > > > >
> > > > > > >
> > > > > > > [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > > > > > > [1]PETSC ERROR: Object is in wrong state
> > > > > > > [1]PETSC ERROR: Matrix is missing diagonal entry 0
> > > > > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/
> documentation/faq.html for trouble shooting.
> > > > > > > [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown
> > > > > > > [1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016
> > > > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1
> --download-ml=1
> > > > > > > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in
> /home/valera/v5PETSc/petsc/petsc/src/mat/impls/aij/seq/aijfact.c
> > > > > > > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in
> /home/valera/v5PETSc/petsc/petsc/src/mat/interface/matrix.c
> > > > > > > [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in
> /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> > > > > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in
> /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> > > > > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in
> /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > > > > > > [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_Singleblock() line
> 650 in /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
> > > > > > > [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in
> /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> > > > > > > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in
> /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > > > > > > [1]PETSC ERROR: #9 KSPSolve() line 600 in
> /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > > > > > > At line 333 of file solvelinearmgPETSc.f90
> > > > > > > Fortran runtime error: Array bound mismatch for dimension 1 of
> array 'sol' (213120/106560)
> > > > > > >
> > > > > > >
> > > > > > > This code works for -n 1 core, but it gives this error when
> > > > > > > using more than one core.
> > > > > > >
> > > > > > > What am I missing?
> > > > > > >
> > > > > > > Regards,
> > > > > > >
> > > > > > > Manuel.
> > > > > > >
> > > > > > > <solvelinearmgPETSc.f90>
> > > > > >
> > > > > >
> > > > >
> > > > >
> > > > > <n4.png><n2.png><n1.png>
> > > >
> > > >
> > > >
> > > > <rhss.txt><solvelinearmgPETSc.f90><as.txt><ias.txt><jas.txt>
> > >
> > >
> >
> >
>
>
PROGRAM solvelinearmgpetsc

IMPLICIT NONE

!PETSc Implementation, include files:
!Created by Manuel Valera 6/10/16:

#include <petsc/finclude/petscsys.h>    !core PETSc
#include <petsc/finclude/petscvec.h>    !vectors
#include <petsc/finclude/petscvec.h90>  !special vectors f90 functions
#include <petsc/finclude/petscmat.h>    !matrices
#include <petsc/finclude/petscis.h>     !index sets
#include <petsc/finclude/petscviewer.h> !viewers
#include <petsc/finclude/petscpc.h>     !preconditioners
#include <petsc/finclude/petscksp.h>    !Krylov subspaces (Solver)


! Local Variables: matrix dimensions for Seamount Testcase (97x61x38)
INTEGER, parameter					:: nbnodos = 213120, nnz = 3934732
! Matrix dimension for Angie Testing case
!INTEGER, parameter					:: nbnodos = 36, nnz = 394
! Tiny testcase 3x3x2
!INTEGER, parameter					:: nbnodos = 4, nnz = 16
DOUBLE PRECISION                                        :: ui,uj,uk
DOUBLE PRECISION                                        :: vi,vj,vk
DOUBLE PRECISION                                        :: wi,wj,wk
DOUBLE PRECISION                                        :: const,maxdiff
DOUBLE PRECISION, DIMENSION(nbnodos)                    :: Rhs
DOUBLE PRECISION, DIMENSION(nbnodos)                    :: sol,sol2,soldiff
INTEGER                                                 :: i,j,k,l
real (kind(0d0))                                        :: tolp,norm2
integer                                                 :: iter,iprint
INTEGER                                                 :: ierr0,ierrm


! PETSc IMPLEMENTATION, Variables:

        PC                 :: pc,mg
        KSP                :: ksp
        Vec                :: xp,bp,up,x,y,work
        Mat                :: Ap,pmat
        MatNullSpace       :: nullsp
        PetscErrorCode     :: ierr
        PetscInt           :: nbdp,nnzp,ii,nel,its,spa
        PetscReal     	   :: norm,tol
	PetscScalar	   :: negone,one
	PetscScalar, pointer :: soln(:)
	PetscScalar, pointer :: rpia(:) 
!	PetscInt, allocatable :: iapi(:),japi(:),ind(:)
	PetscInt, allocatable :: ind(:)
!	PetscScalar, allocatable :: app(:)
        PetscScalar        :: rpi
	PetscViewer	   :: viewer
        KSPConvergedReason :: reason
        PetscMPIInt        :: rankl, sizel
	PetscLogDouble	   :: ti,tf,tt
	

!TODO: IMPORTANT ALL USER-DEFINED ROUTINES SHOULD BE DECLARED AS EXTERNAL

	!external SampleShellPCSetUp,SampleShellPCApply
	
	!common /mypcs/ jacobi,mg,work
	!PC jacobi,mg
	!Vec work

!Testing AGMG reading from a file.
DOUBLE PRECISION,allocatable :: a(:)
integer,allocatable :: ja(:),ia(:)

allocate(a(nnz), stat=ierr)
allocate(ja(nnz), stat=ierr)
allocate(ia(nbnodos+1), stat=ierr)


    open(15,file='iac.txt', status='old')
!    open(15,file='row_ptr.dat', status='old') !smalltestcase
!    open(15,file='ias.txt', status='old') !tinytestcase
    read(15,*) ia
    close(15,status='KEEP')

    open(16,file='jac.txt', status='old')
!    open(16,file='Col_id.dat', status='old') !smalltestcase
!    open(16,file='jas.txt', status='old') !tinytestcase
    read(16,*) ja
    close(16,status='KEEP')

    open(17,file='ac.txt', status='old')
!    open(17,file='Acsr.dat', status='old') !smalltestcase
!    open(17,file='as.txt', status='old') !tinytestcase
    read(17,*) a
    close(17,status='KEEP')

    open(18,file='rhsc.txt', status='old')
!    open(18,file='rhs.dat', status='old')   !smalltestcase
!    open(18,file='rhss.txt', status='old')   !tinytestcase
    read(18,*) Rhs
    close(18,status='KEEP')


!PETSc IMPLEMENTATION, MAT-VEC Population:
!by Manuel Valera, June 2016.

  call MPI_Init(ierrm)
  call MPI_Comm_Rank(MPI_COMM_WORLD,rankl,ierrm)
  call MPI_Comm_split(MPI_COMM_WORLD,1,0,PETSC_COMM_WORLD,ierrm)


  call PetscInitialize(PETSC_NULL_CHARACTER,ierr)

  call PetscTime(ti,ierr)

  call MPI_Comm_rank(PETSC_COMM_WORLD, rankl, ierr)
  call MPI_Comm_size(PETSC_COMM_WORLD, sizel, ierr)


!Adapted from Chris Paolini examples:

 ! 1.- SETUP VECTORS-
 !    Create vectors.  Note that we form 1 vector from scratch and
 !   then duplicate to form b.

 ! Prepare PETSc integers:

  nbdp = nbnodos  !matrix dimension
  nnzp = nnz      !total number of nonzero entries in the matrix

!  allocate(iapi(nbdp+1), stat=ierr)
!  allocate(japi(nnzp), stat=ierr)
!  allocate(app(nnzp), stat=ierr)
  allocate(ind(nnzp), stat=ierr)

  allocate(soln(nbdp), stat=ierr)

  call VecCreate(PETSC_COMM_WORLD,xp,ierr)
  call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr)   ! Size of xp - nbnodos
      !VecSetSizes(Vec v, PetscInt n, PetscInt N)
  call VecSetFromOptions(xp,ierr)
  call VecDuplicate(xp,bp,ierr)             ! duplicate xp (soln vector) into bp(RHS),both have same length
  call VecDuplicate(xp,up,ierr)    	    ! up = exact solution
  call VecDuplicate(xp,work,ierr)    	    ! some other vector
  call VecDuplicate(xp,y,ierr)    	    ! some other vector
  CHKERRQ(ierr)

 ! 1.2.- SET VECTORS VALUES:

  call VecSet(xp,0.0D0,ierr) !initializing SOLN to zeros
  call VecAssemblyBegin(xp,ierr) ; call VecAssemblyEnd(xp,ierr) 

  call VecSet(up,1.0D0,ierr) !initializing exact soln to ones
  call VecAssemblyBegin(up,ierr) ; call VecAssemblyEnd(up,ierr) 

  call VecAssemblyBegin(work,ierr) ; call VecAssemblyEnd(work,ierr)
  call VecAssemblyBegin(y,ierr) ; call VecAssemblyEnd(y,ierr)  

  call VecGetSize(xp,nel,ierr)
  print*, "TrivSoln loaded, size: ", nel, "/",nbdp



!  call VecGetArrayF90(xp,soln,ierr)
!  sol = soln             
!  call VecRestoreArrayF90(xp,soln,ierr)

!  print*, "XP EYE maxval:", MAXVAL(sol) 			
!  print*, "XP EYE minval:", MINVAL(sol) 	


 ! Load RHS:

!  call VecSet(bp,5.0D0,ierr) !making Rhs something else (all 5's)

  do i=0,nbdp-1,1
  	ind(i+1) = i
  enddo

  call VecSetValues(bp,nbdp,ind,Rhs,INSERT_VALUES,ierr) !feeding whole RHS in one call

  call VecAssemblyBegin(bp,ierr) ; call VecAssemblyEnd(bp,ierr) !rhs

!  call VecView(bp,PETSC_VIEWER_STDOUT_WORLD,ierr) ! RHS looks OK
  call VecGetSize(bp,nel,ierr)
  print*, "RHS loaded, size: ", nel, "/",nbdp
  
!  allocate(rpia(nbdp),stat=ierr)

!  call VecGetArrayF90(bp,rpia,ierr)
!  print*, "RHS maxvalue:", MAXVAL(rpia)    !a way to look if rhs is changing            
!  call VecRestoreArrayF90(bp,rpia,ierr)
 
!  deallocate(rpia,stat=ierr)


!   print*, ia
!   print*, ja
!   print*, a

!   call loadsavematPETSc(ia,ja,a,nnz,nbnodos)

   call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
   call PetscViewerBinaryOpen(PETSC_COMM_WORLD,"AP.petsc",FILE_MODE_READ,viewer,ierr)
   call MatLoad(Ap,viewer,ierr)
 


!Matrix viewer (off diagonals not shown ?)

!  call MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr)  !prints everything to stdout
!  call MatView(Ap,PETSC_VIEWER_DRAW_WORLD,ierr)    !shows image
!  call sleep(1)


!  call exit()
!!!! Creating A*eye=RHS

  call MatMult(Ap,up,bp,ierr)    !Ap*up=bp
!!!

 ! 4.- PETSc IMPLEMENTATION, ACTUAL SOLVER:


 !/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
 !    Create the linear solver and set various options
 !    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
 ! /*
 !   Create linear solver context
 ! */

  call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) ! Solver (linear, for nonlinear must use SNES)
  call KSPSetOperators(ksp,Ap,Ap,ierr)   
  call KSPGetPC(ksp,pc,ierr)
  tol = 1.e-5
  call KSPSetTolerances(ksp,tol,PETSC_DEFAULT_REAL,PETSC_DEFAULT_REAL,PETSC_DEFAULT_INTEGER,ierr)

        !Create PC
  call PCGetOperators(pc,Ap,pmat,ierr)
  call PCCreate(PETSC_COMM_WORLD,mg,ierr)
  call PCSetType(mg,PCJACOBI,ierr)
  call PCSetOperators(mg,Ap,pmat,ierr)
  call PCSetUp(mg,ierr)


!  call PCGetOperators(pc,PETSC_NULL_OBJECT,pmat,ierr)
!  call PCCreate(PETSC_COMM_WORLD,mg,ierr)
!  call PCSetType(mg,PCGAMG,ierr)
!  call PCSetOperators(mg,pmat,pmat,ierr)
!  call PCSetUp(mg,ierr)


	!MG options
!  call PCGAMGSetNLevels(mg,3,PETSC_NULL_OBJECT,ierr)
!  call PCGAMGSetNSmooths(mg,1,ierr)
!  call PCGAMGSetThreshold(mg,0.0d0,ierr)


	!Apply PC
  call PCApply(mg,xp,work,ierr)
  one = 1.0
  call VecAXPY(y,one,work,ierr)


!!Calling PCs as example42KSP:
!  call PCSetType(pc,PCSHELL,ierr)  	!PC setted as ex42 of ksp
!  call PCSetType(pc,PCJACOBI,ierr)  	!
!  call PCShellSetApply(pc,SampleShellPCApply,ierr)
!  call SampleShellPCSetUp(pc,xp,ierr)



  call KSPSetType(ksp,KSPGCR,ierr)     !agmg uses this, basically.
  call KSPSetCheckNormIteration(ksp,100,ierr)  !agmg has 5000


  CHKERRQ(ierr)

  call KSPSetFromOptions(ksp,ierr)

  CHKERRQ(ierr)  

  call KSPSetUp(ksp,ierr);		
!  call KSPSetUpOnBlocks(ksp,ierr);
 
  CHKERRQ(ierr)

  !/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
  !   Solve the linear system
  ! - - - - - - - - - - - - - - - - - - - - - - - */

  call KSPSolve(ksp,bp,xp,ierr)

!  call KSPMonitorTrueResidualNorm(ksp,its,norm,viewer,ierr)

!  call VecGetArrayF90(xp,soln,ierr)
!  sol = soln             !Copying new solution to regular scalar array
!  call VecRestoreArrayF90(xp,soln,ierr)

!  print*, "soln maxval:", MAXVAL(sol) 			
!  print*, "soln minval:", MINVAL(sol) 			

!Check the error as in ex1.c from petsc manual:

  negone = -1.0d0

  call VecAXPY(xp,negone,up,ierr);
  call VecNorm(xp,NORM_2,norm,ierr);
  call KSPGetIterationNumber(ksp,its,ierr);
  iter = its
  norm2 = norm
  print *, "Norm:", norm2 
  print *, "Its:", iter
  



!End checking error

!  deallocate(iapi,stat=ierr0) 
!  IF (ierr0 /= 0) STOP "*** iapi ***"
!  deallocate(japi,stat=ierr0)
!  IF (ierr0 /= 0) STOP "*** japi ***"

  ! 5.- CLEAN UP OBJECTS:

  call MatDestroy(Ap,ierr)  
  call VecDestroy(xp,ierr)
  call VecDestroy(bp,ierr)
  call VecDestroy(up,ierr)
  call KSPDestroy(ksp,ierr)
  call PCDestroy(mg,ierr)
!  call PCDestroy(pc,ierr)   !this gives error - weird
  call VecDestroy(work,ierr)
  call VecDestroy(y,ierr)
  CHKERRQ(ierr)

!  deallocate(app,stat=ierr0) 
!  IF (ierr0 /= 0) STOP "*** app ***"
  CHKERRQ(ierr)

  call PetscTime(tf,ierr)

  tt = tf - ti

  print*, "Total time:", tt

  call PetscFinalize(ierr)
  call MPI_Finalize(ierrm)

  call EXIT() 	!abort program at this point. 

END PROGRAM solvelinearmgpetsc

!Subroutines from example42 KSP-PETSc:

!/***********************************************************************/
!/*          Routines for a user-defined shell preconditioner           */
!/***********************************************************************/

!
!   SampleShellPCSetUp - This routine sets up a user-defined
!   preconditioner context.
!
!   Input Parameters:
!   pc    - preconditioner object
!   x     - vector
!
!   Output Parameter:
!   ierr  - error code (nonzero if error has been detected)
!
!   Notes:
!   In this example, we define the shell preconditioner to be Jacobi
!   method.  Thus, here we create a work vector for storing the reciprocal
!   of the diagonal of the preconditioner matrix; this vector is then
!   used within the routine SampleShellPCApply().
!
!	subroutine SampleShellPCSetUp(pc,x,ierr)
!	implicit none

!#include <petsc/finclude/petscsys.h>
!#include <petsc/finclude/petscvec.h>
!#include <petsc/finclude/petscvec.h90>
!#include <petsc/finclude/petscmat.h>
!#include <petsc/finclude/petscpc.h>    
!#include <petsc/finclude/petscksp.h>  


!       PC      pc
!       Vec     x
!       Mat     pmat
!       PetscErrorCode ierr
!       PetscInt	nl,k,finest
!       DM	da_list,daclist
!       Mat      R 
 

!  Common block to store data for user-provided preconditioner
!       common /mypcs/ jacobi,mg,work
!       PC jacobi,mg
!       Vec work

!! adapting from example 42 of ksp - Petsc

!     call PCGetOperators(pc,PETSC_NULL_OBJECT,pmat,ierr)
!     call PCCreate(PETSC_COMM_WORLD,jacobi,ierr)
!    call PCSetType(jacobi,PCJACOBI,ierr)
!     call PCSetOperators(jacobi,pmat,pmat,ierr)
!     call PCSetUp(jacobi,ierr)

     

 ! .- SOR "2nd level" Preconditioner - as Ex42-ksp

!     call PCSetType(mg,PCSOR,ierr)		!This was PCSOR before
!     call PCSetOperators(mg,pmat,pmat,ierr)
!     call PCSORSetSymmetric(mg,SOR_LOCAL_SYMMETRIC_SWEEP,ierr)

 ! 3.- MULTIGRID SETUP (PCGAMG):     !THIS IS PCGAMG = ALGEBRAIC MG (same results as MG)


!     call PCGetOperators(pc,PETSC_NULL_OBJECT,pmat,ierr)
!     call PCCreate(PETSC_COMM_WORLD,mg,ierr)
!     call PCSetType(mg,PCGAMG,ierr)
!     call PCSetOperators(mg,pmat,pmat,ierr)
!     call PCGAMGSetNLevels(mg,3,PETSC_NULL_OBJECT,ierr)
!     call PCGAMGSetNSmooths(mg,1,ierr)
!     call PCGAMGSetThreshold(mg,0.0d0,ierr)
!     call PCSetUp(mg,ierr)

!     call PCApply(mg,x,y,ierr)
!     call VecAXPY(y,one,x,ierr)


!!!! Endof

!     call VecDuplicate(x,work,ierr)

!      end

! -------------------------------------------------------------------
!
!   SampleShellPCApply - This routine demonstrates the use of a
!   user-provided preconditioner.
!
!   Input Parameters:
!   pc - preconditioner object
!   x - input vector
!
!   Output Parameters:
!   y - preconditioned vector
!   ierr  - error code (nonzero if error has been detected)
!
!   Notes:
!   This code implements the Jacobi preconditioner plus the
!   SOR preconditioner
!
! YOU CAN GET THE EXACT SAME EFFECT WITH THE PCCOMPOSITE preconditioner using
! mpiexec -n 1 ex21f -ksp_monitor -pc_type composite -pc_composite_pcs jacobi,sor -pc_composite_type additive
!
!       subroutine SampleShellPCApply(pc,x,y,ierr)
!       implicit none

!#include <petsc/finclude/petscsys.h>
!#include <petsc/finclude/petscvec.h>
!#include <petsc/finclude/petscpc.h>

!       PC      pc
!       Vec     x,y
!       PetscErrorCode ierr
!       PetscScalar  one

!  Common block to store data for user-provided preconditioner
!       common /mypcs/ jacobi,mg,work
!       PC  jacobi,mg
!       Vec work

!       one = 1.0
!       call PCApply(jacobi,x,y,ierr)
!       call PCApply(mg,x,work,ierr)
!       call VecAXPY(y,one,work,ierr)

!       end


