Thank you very much for offering to debug.
I built PETSc along with AMReX, so I tried to extract a stand-alone PETSc code
that reproduces the same error on the smallest possible problem size.
I have attached three files:
petsc_amrex_error_redistribute.txt – The error message from the amrex/petsc
interface; note, however, that the linear system solves and converges to a solution.
problem.c – A simple stand-alone PETSc code which produces almost the same
error message.
petsc_error_redistribute.txt – The error message from problem.c; strangely,
it does NOT solve, and I am not sure why.
Please use problem.c to debug the issue.
Kind regards,
Karthik.
From: Barry Smith <[email protected]>
Date: Saturday, 4 February 2023 at 00:22
To: Chockalingam, Karthikeyan (STFC,DL,HC) <[email protected]>
Cc: [email protected] <[email protected]>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros
If you can help me reproduce the problem with a simple code I can debug the
problem and fix it.
Barry
On Feb 3, 2023, at 6:42 PM, Karthikeyan Chockalingam - STFC UKRI
<[email protected]> wrote:
I updated the main branch to the below commit but the same problem persists.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f92 GIT
Date: 2023-02-03 18:41:48 +0000
From: Barry Smith <[email protected]<mailto:[email protected]>>
Date: Friday, 3 February 2023 at 18:51
To: Chockalingam, Karthikeyan (STFC,DL,HC)
<[email protected]<mailto:[email protected]>>
Cc: [email protected]<mailto:[email protected]>
<[email protected]<mailto:[email protected]>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros
If you switch to use the main branch of petsc
https://petsc.org/release/install/download/#advanced-obtain-petsc-development-version-with-git
you will not have the problem below (previously we required that a row exist
before we zeroed it, but now we allow the row to initially have no entries and
still be zeroed).
Barry
On Feb 3, 2023, at 1:04 PM, Karthikeyan Chockalingam - STFC UKRI
<[email protected]<mailto:[email protected]>>
wrote:
Thank you. The entire error output was an attachment in my previous email. I am
pasting it here for your reference.
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (65792)
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be
the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-options_left (no value)
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-127-ga207d08eda GIT
Date: 2022-10-30 11:03:25 -0500
[0]PETSC ERROR: /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM
on a named HC20210312 by karthikeyan.chockalingam Fri Feb 3 11:10:01 2023
[0]PETSC ERROR: Configure options --with-debugging=0
--prefix=/Users/karthikeyan.chockalingam/AMReX/petsc --download-fblaslapack=yes
--download-scalapack=yes --download-mumps=yes
--with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre
[0]PETSC ERROR: #1 MatZeroRowsColumns_SeqAIJ() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/seq/aij.c:2218
[0]PETSC ERROR: #2 MatZeroRowsColumns() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6085
[0]PETSC ERROR: #3 MatZeroRowsColumns_MPIAIJ() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/mpi/mpiaij.c:879
[0]PETSC ERROR: #4 MatZeroRowsColumns() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6085
[0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6124
[0]PETSC ERROR: #6 localAssembly() at
/Users/karthikeyan.chockalingam/AMReX/amrFEM/src/FENodalPoisson.cpp:435
Residual norms for redistribute_ solve.
0 KSP preconditioned resid norm 5.182603110407e+00 true resid norm
1.382027496109e+01 ||r(i)||/||b|| 1.000000000000e+00
1 KSP preconditioned resid norm 1.862430383976e+00 true resid norm
4.966481023937e+00 ||r(i)||/||b|| 3.593619546588e-01
2 KSP preconditioned resid norm 2.132803507689e-01 true resid norm
5.687476020503e-01 ||r(i)||/||b|| 4.115313216645e-02
3 KSP preconditioned resid norm 5.499797533437e-02 true resid norm
1.466612675583e-01 ||r(i)||/||b|| 1.061203687852e-02
4 KSP preconditioned resid norm 2.829814271435e-02 true resid norm
7.546171390493e-02 ||r(i)||/||b|| 5.460217985345e-03
5 KSP preconditioned resid norm 7.431048995318e-03 true resid norm
1.981613065418e-02 ||r(i)||/||b|| 1.433844891652e-03
6 KSP preconditioned resid norm 3.182040728972e-03 true resid norm
8.485441943932e-03 ||r(i)||/||b|| 6.139850305312e-04
7 KSP preconditioned resid norm 1.030867020459e-03 true resid norm
2.748978721225e-03 ||r(i)||/||b|| 1.989091193167e-04
8 KSP preconditioned resid norm 4.469429300003e-04 true resid norm
1.191847813335e-03 ||r(i)||/||b|| 8.623908111021e-05
9 KSP preconditioned resid norm 1.237303313796e-04 true resid norm
3.299475503456e-04 ||r(i)||/||b|| 2.387416685085e-05
10 KSP preconditioned resid norm 5.822094326756e-05 true resid norm
1.552558487134e-04 ||r(i)||/||b|| 1.123391894522e-05
11 KSP preconditioned resid norm 1.735776150969e-05 true resid norm
4.628736402585e-05 ||r(i)||/||b|| 3.349236115503e-06
Linear redistribute_ solve converged due to CONVERGED_RTOL iterations 11
KSP Object: (redistribute_) 1 MPI process
type: cg
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
left preconditioning
using PRECONDITIONED norm type for convergence test
PC Object: (redistribute_) 1 MPI process
type: jacobi
type DIAGONAL
linear system matrix = precond matrix:
Mat Object: 1 MPI process
type: mpiaij
rows=48896, cols=48896
total: nonzeros=307976, allocated nonzeros=307976
total number of mallocs used during MatSetValues calls=0
not using I-node (on process 0) routines
End of program
solve time 0.564714744 seconds
Starting max value is: 0
Min value of level 0 is: 0
Interpolated min value is: 741.978761
Unused ParmParse Variables:
[TOP]::model.type(nvals = 1) :: [3]
[TOP]::ref_ratio(nvals = 1) :: [2]
AMReX (22.10-20-g3082028e4287) finalized
#PETSc Option Table entries:
-ksp_type preonly
-options_left
-pc_type redistribute
-redistribute_ksp_converged_reason
-redistribute_ksp_monitor_true_residual
-redistribute_ksp_type cg
-redistribute_ksp_view
-redistribute_pc_type jacobi
#End of PETSc Option Table entries
There are no unused options.
Program ended with exit code: 0
Best,
Karthik.
From: Barry Smith <[email protected]<mailto:[email protected]>>
Date: Friday, 3 February 2023 at 17:41
To: Chockalingam, Karthikeyan (STFC,DL,HC)
<[email protected]<mailto:[email protected]>>
Cc: [email protected]<mailto:[email protected]>
<[email protected]<mailto:[email protected]>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros
We need the full error output for the errors you got below to understand why
they are happening.
On Feb 3, 2023, at 11:41 AM, Karthikeyan Chockalingam - STFC UKRI
<[email protected]<mailto:[email protected]>>
wrote:
Hello Barry,
I would like to better understand pc_type redistribute usage.
I plan to use pc_type redistribute in the context of adaptive mesh
refinement on a structured grid in 2D. My base mesh (level 0) is indexed from 0
to N-1 elements and the refined mesh (level 1) from 0 to 4(N-1)
elements. When I construct the system matrix A on (level 1), I probably only use
20% of the 4(N-1) elements; however, the indexes are scattered in the range 0 to
4(N-1). That leaves 80% of the rows and columns of the system matrix A on
(level 1) zero. From your earlier response, I believe this would be a use
case for pc_type redistribute.
Indeed, the linear solve will be more efficient if you use the redistribute
solver.
But I don't understand your plan. With adaptive refinement I would just
create two matrices: one for the initial grid on which you solve the
system (this will be a smaller matrix), and then a new, larger matrix for
the refined grid (discarding the previous matrix).
Question (1)
If N is really large, I would have to allocate memory of size 4(N-1) for the
system matrix A on (level 1). How does pc_type redistribute help? Because I
would end up allocating memory for a large system where most of the rows and
columns are zero. Is most of the allocated memory not wasted? Is this the
correct usage?
See above
Question (2)
I tried using pc_type redistribute for a two-level system.
I have attached the output only for (level 1).
The solution converges to the right solution, but PETSc still outputs some
error messages.
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be
the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-options_left (no value)
But there were no unused options:
#PETSc Option Table entries:
-ksp_type preonly
-options_left
-pc_type redistribute
-redistribute_ksp_converged_reason
-redistribute_ksp_monitor_true_residual
-redistribute_ksp_type cg
-redistribute_ksp_view
-redistribute_pc_type jacobi
#End of PETSc Option Table entries
There are no unused options.
Program ended with exit code: 0
I cannot explain this.
Question (3)
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (65792)
What does this error message imply? Given that I only use 20% of the 4(N-1)
indexes, I can imagine most of the diagonal entries are zero. Is my
understanding correct?
Question (4)
[0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6124
I am using MatZeroRowsColumnsIS to set the homogeneous Dirichlet boundary. I
don’t follow why I get this error message, since the linear system converges to
the right solution.
Thank you for your help.
Kind regards,
Karthik.
From: Barry Smith <[email protected]<mailto:[email protected]>>
Date: Tuesday, 10 January 2023 at 18:50
To: Chockalingam, Karthikeyan (STFC,DL,HC)
<[email protected]<mailto:[email protected]>>
Cc: [email protected]<mailto:[email protected]>
<[email protected]<mailto:[email protected]>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros
Yes, after the solve x will contain correct values for ALL the locations,
including the zeroed-out rows. Your use case is exactly what redistribute is
for.
Barry
On Jan 10, 2023, at 11:25 AM, Karthikeyan Chockalingam - STFC UKRI
<[email protected]<mailto:[email protected]>>
wrote:
Thank you Barry. This is great!
I plan to solve using ‘-pc_type redistribute’ after applying the Dirichlet bc
using
MatZeroRowsColumnsIS(A, isout, 1, x, b);
When I retrieve the solution data from x (after the solve), can I index it
using the original ordering (if I may say that)?
Kind regards,
Karthik.
From: Barry Smith <[email protected]<mailto:[email protected]>>
Date: Tuesday, 10 January 2023 at 16:04
To: Chockalingam, Karthikeyan (STFC,DL,HC)
<[email protected]<mailto:[email protected]>>
Cc: [email protected]<mailto:[email protected]>
<[email protected]<mailto:[email protected]>>
Subject: Re: [petsc-users] Eliminating rows and columns which are zeros
https://petsc.org/release/docs/manualpages/PC/PCREDISTRIBUTE/#pcredistribute
-pc_type redistribute
It does everything for you. Note that if the right-hand side for any of the
"zero" rows is nonzero then the system is inconsistent and does not have a
solution.
Barry
On Jan 10, 2023, at 10:30 AM, Karthikeyan Chockalingam - STFC UKRI via
petsc-users <[email protected]<mailto:[email protected]>> wrote:
Hello,
I am assembling a MATMPIAIJ matrix of size N, where a very large number of rows
(and corresponding columns) are zero. I would like to potentially eliminate
them before the solve.
For instance, say N=7:
0 0 0 0 0 0 0
0 1 -1 0 0 0 0
0 -1 2 0 0 0 -1
0 0 0 0 0 0 0
0 0 0 0 0 0 0
0 0 0 0 0 0 0
0 0 -1 0 0 0 1
I would like to reduce it to a 3x3
1 -1 0
-1 2 -1
0 -1 1
I do know the size N.
Q1) How do I do it?
Q2) Is it better to eliminate them as it would save a lot of memory?
Q3) At the moment, I don’t know which rows (and columns) have the zero entries,
but with some effort I can probably find them. Do I need to know which rows
(and columns) I am eliminating?
Thank you.
Karthik.
This email and any attachments are intended solely for the use of the named
recipients. If you are not the intended recipient you must not use, disclose,
copy or distribute this email or any of its attachments and should notify the
sender immediately and delete this email from your system. UK Research and
Innovation (UKRI) has taken every reasonable precaution to minimise risk of
this email or any attachments containing viruses or malware but the recipient
should carry out its own virus and malware checks before opening the
attachments. UKRI does not accept any liability for any losses or damages which
the recipient may sustain due to presence of any viruses.
<petsc_redistribute.txt>
[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (20)
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f924 GIT
Date: 2023-02-03 18:41:48 +0000
[0]PETSC ERROR: ./problem on a arch-darwin-c-opt named HC20210312 by
karthikeyan.chockalingam Sat Feb 4 16:47:05 2023
[0]PETSC ERROR: Configure options --with-debugging=0 --download-fblaslapack=yes
--download-scalapack=yes --download-mumps=yes
[0]PETSC ERROR: #1 MatZeroRowsColumns_SeqAIJ() at
/Users/karthikeyan.chockalingam/PETSc/petsc/src/mat/impls/aij/seq/aij.c:2225
[0]PETSC ERROR: #2 MatZeroRowsColumns() at
/Users/karthikeyan.chockalingam/PETSc/petsc/src/mat/interface/matrix.c:6109
[0]PETSC ERROR: #3 MatZeroRowsColumns_MPIAIJ() at
/Users/karthikeyan.chockalingam/PETSc/petsc/src/mat/impls/aij/mpi/mpiaij.c:879
[0]PETSC ERROR: #4 MatZeroRowsColumns() at
/Users/karthikeyan.chockalingam/PETSc/petsc/src/mat/interface/matrix.c:6109
[0]PETSC ERROR: #5 main() at problem.c:98
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error
message to [email protected]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
with errorcode 73.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry in row 0 (20)
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be
the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-options_left (no value) source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f92 GIT
Date: 2023-02-03 18:41:48 +0000
[0]PETSC ERROR: /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM
on a named HC20210312 by karthikeyan.chockalingam Sat Feb 4 11:42:53 2023
[0]PETSC ERROR: Configure options --with-debugging=0
--prefix=/Users/karthikeyan.chockalingam/AMReX/petsc --download-fblaslapack=yes
--download-scalapack=yes --download-mumps=yes
--with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre
[0]PETSC ERROR: #1 MatZeroRowsColumns_SeqAIJ() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/seq/aij.c:2225
[0]PETSC ERROR: #2 MatZeroRowsColumns() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6109
[0]PETSC ERROR: #3 MatZeroRowsColumns_MPIAIJ() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/impls/aij/mpi/mpiaij.c:879
[0]PETSC ERROR: #4 MatZeroRowsColumns() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6109
[0]PETSC ERROR: #5 MatZeroRowsColumnsIS() at
/Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/mat/interface/matrix.c:6148
[0]PETSC ERROR: #6 localAssembly() at
/Users/karthikeyan.chockalingam/AMReX/amrFEM/src/FENodalPoisson.cpp:444
Residual norms for redistribute_ solve.
0 KSP preconditioned resid norm 2.715290039756e+02 true resid norm
7.240773439350e+02 ||r(i)||/||b|| 1.000000000000e+00
1 KSP preconditioned resid norm 5.430580079513e+01 true resid norm
1.448154687870e+02 ||r(i)||/||b|| 2.000000000000e-01
2 KSP preconditioned resid norm 1.865174681370e-14 true resid norm
4.973799150321e-14 ||r(i)||/||b|| 6.869154506742e-17
Linear redistribute_ solve converged due to CONVERGED_RTOL iterations 2
KSP Object: (redistribute_) 1 MPI process
type: cg
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
left preconditioning
using PRECONDITIONED norm type for convergence test
PC Object: (redistribute_) 1 MPI process
type: jacobi
type DIAGONAL
linear system matrix = precond matrix:
Mat Object: 1 MPI process
type: mpiaij
rows=8, cols=8
total: nonzeros=32, allocated nonzeros=32
total number of mallocs used during MatSetValues calls=0
not using I-node (on process 0) routines
End of program
solve time 0.018713176 seconds
Starting max value is: 0
Min value of level 0 is: 0
Interpolated min value is: 789.9428571
Unused ParmParse Variables:
[TOP]::model.type(nvals = 1) :: [3]
[TOP]::ref_ratio(nvals = 1) :: [2]
AMReX (22.10-20-g3082028e4287) finalized
#PETSc Option Table entries:
-ksp_type preonly # (source: code)
-options_left # (source: code)
-pc_type redistribute # (source: code)
-redistribute_ksp_converged_reason # (source: code)
-redistribute_ksp_monitor_true_residual # (source: code)
-redistribute_ksp_type cg # (source: code)
-redistribute_ksp_view # (source: code)
-redistribute_pc_type jacobi # (source: code)
#End of PETSc Option Table entries
There are no unused options.
Program ended with exit code: 0
#include <petscksp.h>
#include <petscdm.h>
#include <petscdmda.h>
#include <petsc.h>
#include <petscsnes.h>
#include "petscsys.h"
#include "petscis.h"
#include <stdio.h>
// Map the local (nx, ny) corner of a cell to its index in the 4-node
// element arrays.
int GlobalToLocal(int x, int y)
{
if((x == 0) && (y == 0)) return 0;
if((x == 1) && (y == 0)) return 1;
if((x == 0) && (y == 1)) return 2;
if((x == 1) && (y == 1)) return 3;
return -1; /* unreachable for nx, ny in {0, 1} */
}
int main(int argc, char *argv[])
{
PetscInitialize(&argc, &argv, PETSC_NULLPTR, PETSC_NULLPTR);
int rank, size;
MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
MPI_Comm_size(PETSC_COMM_WORLD, &size);
PetscScalar kloc[16]; /* 4x4 element stiffness matrix */
kloc[0] = 0.6666666667; kloc[1] = -0.1666666667; kloc[2] = -0.1666666667;
kloc[3] = -0.3333333333;
kloc[4] = -0.1666666667; kloc[5] = 0.6666666667; kloc[6] = -0.3333333333;
kloc[7] = -0.1666666667;
kloc[8] = -0.1666666667; kloc[9] = -0.3333333333; kloc[10] = 0.6666666667;
kloc[11] = -0.1666666667;
kloc[12] = -0.3333333333; kloc[13] = -0.1666666667; kloc[14] =
-0.1666666667; kloc[15] = 0.6666666667;
PetscScalar floc[4]; /* element load vector */
floc[0] = 64; floc[1] = 64; floc[2] = 64; floc[3] = 64;
PetscInt bnd[17]; /* global indices of Dirichlet boundary nodes */
bnd[0] = 20; bnd[1] = 21; bnd[2] = 22; bnd[3] = 23; bnd[4] = 24;
bnd[5] = 56; bnd[6] = 57; bnd[7] = 58; bnd[8] = 59; bnd[9] = 60;
bnd[10] = 29; bnd[11] = 38; bnd[12] = 47;
bnd[13] = 33; bnd[14] = 42; bnd[15] = 51;
bnd[16] = 40;
PetscErrorCode ierr;
PetscInt N = 81;
PetscInt d_nz = 10;
PetscInt o_nz = 10;
KSP ksp;
Vec x, b;
Mat A;
ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr);
ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr);
ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr);
ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr);
ierr = MatCreateVecs(A, &b, &x);CHKERRQ(ierr);
int ncells = 8;
PetscInt nodes[4];
PetscInt i, j, nx, ny;
for (j = 2; j < 6; j++)
{
for (i = 2; i < 6; i++)
{
for (ny = 0; ny < 2; ny++)
{
for (nx = 0; nx < 2; nx++)
{
int ni, nj, globaln;
ni = i + nx;
nj = j + ny;
// a single index is used to refer to a global node
globaln = ni * (ncells + 1) + nj;
//printf("%d\t%d\t%d\n", i, j, globaln);
nodes[GlobalToLocal(nx, ny)] = globaln;
}
}
ierr = MatSetValues(A, 4, nodes, 4, nodes, kloc, ADD_VALUES); CHKERRQ(ierr);
ierr = VecSetValues(b, 4, nodes, floc, ADD_VALUES); CHKERRQ(ierr);
}
}
ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
ierr = VecAssemblyBegin(b); CHKERRQ(ierr);
ierr = VecAssemblyEnd(b); CHKERRQ(ierr);
ierr = MatZeroRowsColumns(A, 17, bnd, 1, x, b); CHKERRQ(ierr);
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetOperators(ksp, A, A);
ierr = PetscOptionsSetValue(NULL,"-ksp_type", "preonly"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-pc_type", "redistribute"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-redistribute_ksp_type", "cg");
CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-redistribute_pc_type", "jacobi");
CHKERRQ(ierr);
//ierr = PetscOptionsSetValue(NULL, "-ksp_monitor", NULL);CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-redistribute_ksp_view",
NULL);CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,
"-redistribute_ksp_monitor_true_residual", NULL);CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-redistribute_ksp_converged_reason",
NULL);CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-options_left", NULL);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
ierr = VecDestroy(&x); CHKERRQ(ierr);
ierr = VecDestroy(&b); CHKERRQ(ierr);
ierr = MatDestroy(&A); CHKERRQ(ierr);
PetscFinalize();
return 0;
}