Re: [petsc-users] A number of questions about DMDA with SNES and Quasi-Newton methods

2017-10-02 Thread zakaryah .
I'm still working on this.  I've made some progress, and it looks like the
issue is with the KSP, at least for now.  The Jacobian may be
ill-conditioned.  Is it possible to use -snes_test_display during an
intermediate step of the analysis?  I would like to inspect the Jacobian
after several solves have already completed, just to make sure there are no
mistakes there.  I tried

ierr = SNESSetType(mp->PETSc_snes, SNESTEST);CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-snes_test_display", "");CHKERRQ(ierr);

and the first line works, of course, but the second line doesn't seem to
activate the printing of the Jacobian.  I also tried it with "true" in the
last argument and that didn't work either.
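For reference, a likely ordering issue is that an option inserted with PetscOptionsSetValue() only takes effect if it is in the options database before the object reads it. A minimal sketch of the sequence to try (an assumption, not verified here; `mp->PETSc_snes` as in the fragment above, and the code must reach SNESSetFromOptions() afterwards):

```c
/* Sketch (assumption, not verified here): the option must be in the
 * options database before the SNES reads it, i.e. before
 * SNESSetFromOptions() runs. */
ierr = PetscOptionsSetValue(NULL, "-snes_test_display", "");CHKERRQ(ierr);
ierr = SNESSetType(mp->PETSc_snes, SNESTEST);CHKERRQ(ierr);
ierr = SNESSetFromOptions(mp->PETSc_snes);CHKERRQ(ierr);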

On Tue, Sep 5, 2017 at 9:39 AM, Jed Brown  wrote:

> "zakaryah ."  writes:
>
> > OK - I've checked the Jacobian and function very thoroughly and I am
> > confident there are no errors.
>
> Does Newton converge quadratically when you have a good enough initial
> guess?
>
> Globalization of large deformation elasticity is a persistent
> engineering challenge.  The standard approach is to use a continuation,
> often in the form of load increments.
>
> Regarding trust region documentation, the man page says
>
>The basic algorithm is taken from "The Minpack Project", by More',
>Sorensen, Garbow, Hillstrom, pages 88-111 of "Sources and Development
>of Mathematical Software", Wayne Cowell, editor.
>
> You should be able to make sense of it by reading any other source on
> trust region methods.
>
> > I suspect that I am having problems with a bad starting point, and the
> SNES
> > cannot find the global minimum from there.  I know that the global
> minimum
> > (with residual zero) exists in all cases but I would like the methods for
> > finding it to be more robust to the starting value.
> >
> > The problem comes from the physics of finite deformations of elastic
> > materials.  In short, I have a functional of smooth maps on a 3D domain
> to
> > itself.  The functional contains two terms.  The first term represents
> > forces which come from external data, and in the Lagrangian this term
> only
> > involves the values of the map at the point in question.  The second term
> > penalizes fluctuations in the map, and can take various forms.  The
> > simplest form is just the Dirichlet energy, but I'm also interested in
> the
> > infinitesimal strain energy and the finite strain energy.  The first two
> > have terms in the Lagrangian which are first order in the second spatial
> > derivatives of the map, while the third (finite strain energy) has terms
> > which are up to third order in the first and second spatial derivatives
> of
> > the map.  It is the finite strain energy term which has been problematic.
> >
> > The Euler-Lagrange equations are discretized on a cubic grid, with equal
> > interval spacing in each dimension.  The map is the dependent variable,
> > i.e. the x in F(x) = 0.  I prefer Neumann boundary conditions.  Because
> the
> > spatial derivatives of the map are usually small, the Jacobian typically
> > has large values in 3x3 blocks along the diagonal (which depend on the
> map
> > and the external data), and up to 54 values which are functions of the
> > spatial derivatives of the map and tend to be smaller.
> >
> > Do you have any advice on diagnosing and improving situations in which
> > Newton's method finds a stationary point that is not the state with
> > globally minimal residual?  One hint is that -snes_type newtonls does not
> > find as good a solution as -snes_type newtontr but I don't know much
> about
> > these trust region methods, or how to customize and assess them.  I'm
> > grateful for any advice.
> >
> > On Mon, Sep 4, 2017 at 5:44 PM, zakaryah .  wrote:
> >
> >> Yes, it looks like it IS the other way around, and I think the row is
> >>
> >> r.c + r.i*3 + r.j*3*M + r.k*3*M*N, where r.i is in [0,M-1], r.j is in
> >> [0,N-1], and r.k is in [0,P-1].
> >>
> >> That matches the boundary conditions in the displayed Jacobian.
> >>
> >> On Mon, Sep 4, 2017 at 5:33 PM, Barry Smith  wrote:
> >>
> >>>
> >>> > On Sep 4, 2017, at 4:09 PM, zakaryah .  wrote:
> >>> >
> >>> > OK that is super helpful.  Just to be sure - for MxNxP, the row r in
> >>> the Jacobian is at r.i*P*N*3 + r.j*P*3 + r.k*3 + r.c?
> >>>
> >>>   It is that way, or the other way around: r.k*M*N*3 + r.j*M*3 + r.i*3 +
> >>> r.c
> >>> >
> >>> >
> >>> > On Mon, Sep 4, 2017 at 4:58 PM, Barry Smith 
> wrote:
> >>> >
> >>> > > On Sep 4, 2017, at 3:48 PM, zakaryah .  wrote:
> >>> > >
> >>> > > One piece of information that would be useful is what ordering
> PETSc
> >>> uses for the Jacobian in the snes_test_display.  Is it a natural
> ordering,
> >>> or the PETSc ordering?  For debugging the Jacobian manually, the
> natural
> >>> ordering is much easier to work with.
> >>> >
> >>> >What is displayed is always the natural ordering (internally it is
> >>> not the natural ordering).
> >>> >
> >>> > >  For -n 1, ar

Re: [petsc-users] Mat/Vec with empty ranks

2017-10-02 Thread Florian Lindner


Am 02.10.2017 um 21:04 schrieb Matthew Knepley:
> On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner wrote:
> 
> Hello,
> 
> I have a matrix and vector that live on 4 ranks, but only ranks 2 and 3
> have values:
> 
> Doing a simple LSQR solve does not converge. However, when the values are 
> distributed equally, it converges within 3
> iterations.
> 
> What can I do about that?
> 
> I have attached a simple program that creates the matrix and vector or
> loads them from a file.
> 
> 
> There are a few problems with this program. I am attaching a cleaned-up
> version. However, convergence still differs starting at iteration 2. It
> appears that LSQR has a problem with this system, or we have a bug that I
> cannot see.

Thanks for having a look at it!

And good to hear it's not by design. If I can be of any more help tracking this
down, please let me know.

In the meantime, what would be a good way to work around this? This is
admittedly a very malformed example. Is there a way to force solving on a
single CPU and then distribute the results or the KSP object back to the
original parallel layout? Of course, we would first try to solve in parallel,
but we have little influence over the actual parallel layout, since we are
just a library and other solvers give us the data.

Best,
Florian
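One direction that might be worth sketching as a workaround (an untested assumption, particularly for LSQR on this system): PETSc's PCREDUNDANT gathers the operator onto a subset of processes, solves there, and scatters the solution back to the original layout.

```c
/* Sketch (untested assumption for this LSQR case): gather the system
 * onto one process, solve there, scatter the result back. */
ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCREDUNDANT);CHKERRQ(ierr);
ierr = PCRedundantSetNumber(pc, 1);CHKERRQ(ierr); /* one redundant solve */
```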


Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
Well that is strange, the PETSc tests work.

Wenbo, could you please:

> git clone -b gamg-fix-eig-err https://bitbucket.org/petsc/petsc petsc2
> cd petsc2

and then reconfigure, run make, and rerun your test without the
-st_gamg_est_ksp_error_if_not_converged 0 workaround, to see whether the
branch fixes the problem.

Don't forget to set PETSC_DIR=/petsc2

If you have time and this works, you could do a 'git checkout master',
remake, and retest. You should not have to reconfigure. I have tested master
against the PETSc tests; I don't understand how this happened.

Thanks,
Mark


On Mon, Oct 2, 2017 at 12:32 PM, Wenbo Zhao 
wrote:

>
>
> On Tue, Oct 3, 2017 at 12:23 AM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 11:56 AM, Wenbo Zhao 
>> wrote:
>>
>>>
>>>
>>> On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams  wrote:
>>>
 Wenbo, do you build your PETSc?

 Yes.
>>> My configure option is listed below
>>> ./configure --with-mpi=1 --with-shared-libraries=1 \
>>> --with-64-bit-indices=1 --with-debugging=1
>>>
>>> And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.bashrc.
>>>
>>>
>>> The Makefile for my problem is listed below,
>>>
>>> PETSC_ARCH = arch-linux2-c-debug
>>> PETSC_DIR = /home/zhaowenbo/research/petsc/petsc_git
>>> SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc_git
>>> #PETSC_DIR = /home/zhaowenbo/research/petsc/petsc-3.7.4
>>> #SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc-3.7.3
>>> HYPRE_DIR = /usr/local/hypre
>>> #
>>> DEBUG_OPT  = -g
>>> COMP_FLAGS = -fPIC  -Wall  \
>>>   -I${SLEPC_DIR}/include -I${SLEPC_DIR}/${PETSC_ARCH}/include \
>>>   -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include \
>>>   -Isrc
>>>
>>> LINK_FLAGS = -fPIC -Wall  \
>>>   -Wl,-rpath,${SLEPC_DIR}/${PETSC_ARCH}/lib
>>> -L${SLEPC_DIR}/${PETSC_ARCH}/lib -lslepc \
>>>   -Wl,-rpath,${PETSC_DIR}/${PETSC_ARCH}/lib
>>> -L${PETSC_DIR}/${PETSC_ARCH}/lib  -lpetsc \
>>>   -llapack -lblas -lhwloc -lm -lgfortran  -lquadmath
>>>
>>> step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
>>> mpicxx -o step-41 $^  ${LINK_FLAGS} ${DEBUG_OPT}
>>>
>>> src/main.o: src/main.c
>>> mpicxx -o src/main.o -c  $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>>
>>> src/readinp.o: src/readinp.c
>>> mpicxx -o src/readinp.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>>
>>> src/sp3.o: src/sp3.c
>>> mpicxx -o src/sp3.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>>
>>> src/diffu.o: src/diffu.c
>>> mpicxx -o src/diffu.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>>
>>> src/base.o: src/base.c
>>> mpicxx -o src/base.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}
>>>
>>>
>>> clean:
>>> rm step-41 src/main.o src/readinp.o src/sp3.o src/diffu.o src/base.o
>>>
>>> runkr_smooth:
>>> mpirun -n ${NCORE} ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-mata AMAT.dat -matb BMAT.dat \
>>>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>>>-st_gamg_est_ksp_converged_reason \
>>>
>>
>> Add -st_gamg_est_ksp_error_if_not_converged 0
>>
>>   Thanks,
>>
>>  Matt
>>
>
> It works after adding -st_gamg_est_ksp_error_if_not_converged 0.
>
> Thanks,
> Wenbo
>
>
>>
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>>
>>> runkr_nonsmooth:
>>> mpirun -n ${NCORE} ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-mata AMAT.dat -matb BMAT.dat \
>>>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>>>-st_gamg_est_ksp_converged_reason \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>>
>>>
>>> Thanks,
>>> Wenbo
>>>
>>>
>>>
>>>
 On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams  wrote:

> This is normal:
>
> Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS
> iterations 10
>
> It looks like ksp->errorifnotconverged got set somehow. If the
> default changed in KSP then (SAGG) GAMG would not ever work.
>
> I assume you don't have a .petscrc file with more (crazy) options in
> it ...
>
>
> On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao 
> wrote:
>
>>
>>
>> On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley 
>> wrote:
>>
>>> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:
>>>
 non-smoothed aggregation is converging very fast. smoothed fails in
 the eigen estimator.

 Run this again with -st_gamg_est_ksp_view and
 -st_gamg_est_ksp_monitor, and see if you get more output (I'm not 100% 
 sure
 about these args).

>>>
>>> I also want -st_gamg_est_ksp_converged_reason
>>>
>>>   Thanks,
>>>
>>> Matt
>>>
>> $make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-mata

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
On Tue, Oct 3, 2017 at 12:23 AM, Matthew Knepley  wrote:

> On Mon, Oct 2, 2017 at 11:56 AM, Wenbo Zhao 
> wrote:
>
>>
>>
>> On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams  wrote:
>>
>>> Wenbo, do you build your PETSc?
>>>
>>> Yes.
>> My configure option is listed below
>> ./configure --with-mpi=1 --with-shared-libraries=1 \
>> --with-64-bit-indices=1 --with-debugging=1
>>
>> And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.bashrc.
>>
>>
>> The Makefile for my problem is listed below,
>>
>> PETSC_ARCH = arch-linux2-c-debug
>> PETSC_DIR = /home/zhaowenbo/research/petsc/petsc_git
>> SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc_git
>> #PETSC_DIR = /home/zhaowenbo/research/petsc/petsc-3.7.4
>> #SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc-3.7.3
>> HYPRE_DIR = /usr/local/hypre
>> #
>> DEBUG_OPT  = -g
>> COMP_FLAGS = -fPIC  -Wall  \
>>   -I${SLEPC_DIR}/include -I${SLEPC_DIR}/${PETSC_ARCH}/include \
>>   -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include \
>>   -Isrc
>>
>> LINK_FLAGS = -fPIC -Wall  \
>>   -Wl,-rpath,${SLEPC_DIR}/${PETSC_ARCH}/lib
>> -L${SLEPC_DIR}/${PETSC_ARCH}/lib -lslepc \
>>   -Wl,-rpath,${PETSC_DIR}/${PETSC_ARCH}/lib
>> -L${PETSC_DIR}/${PETSC_ARCH}/lib  -lpetsc \
>>   -llapack -lblas -lhwloc -lm -lgfortran  -lquadmath
>>
>> step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
>> mpicxx -o step-41 $^  ${LINK_FLAGS} ${DEBUG_OPT}
>>
>> src/main.o: src/main.c
>> mpicxx -o src/main.o -c  $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>
>> src/readinp.o: src/readinp.c
>> mpicxx -o src/readinp.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>
>> src/sp3.o: src/sp3.c
>> mpicxx -o src/sp3.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>
>> src/diffu.o: src/diffu.c
>> mpicxx -o src/diffu.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>>
>> src/base.o: src/base.c
>> mpicxx -o src/base.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}
>>
>>
>> clean:
>> rm step-41 src/main.o src/readinp.o src/sp3.o src/diffu.o src/base.o
>>
>> runkr_smooth:
>> mpirun -n ${NCORE} ./step-41 \
>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>>-st_gamg_est_ksp_converged_reason \
>>
>
> Add -st_gamg_est_ksp_error_if_not_converged 0
>
>   Thanks,
>
>  Matt
>

It works after adding -st_gamg_est_ksp_error_if_not_converged 0.

Thanks,
Wenbo


>
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>
>> runkr_nonsmooth:
>> mpirun -n ${NCORE} ./step-41 \
>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>>-st_gamg_est_ksp_converged_reason \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>
>>
>> Thanks,
>> Wenbo
>>
>>
>>
>>
>>> On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams  wrote:
>>>
 This is normal:

 Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS
 iterations 10

 It looks like ksp->errorifnotconverged got set somehow. If the default
 changed in KSP then (SAGG) GAMG would not ever work.

 I assume you don't have a .petscrc file with more (crazy) options in it
 ...


 On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao 
 wrote:

>
>
> On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:
>>
>>> non-smoothed aggregation is converging very fast. smoothed fails in
>>> the eigen estimator.
>>>
>>> Run this again with -st_gamg_est_ksp_view and
>>> -st_gamg_est_ksp_monitor, and see if you get more output (I'm not 100% 
>>> sure
>>> about these args).
>>>
>>
>> I also want -st_gamg_est_ksp_converged_reason
>>
>>   Thanks,
>>
>> Matt
>>
> $make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-mata AMAT.dat -matb BMAT.dat \
>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>-st_gamg_est_ksp_converged_reason \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
> Thanks
> Wenbo
>
>
>>
>>
>>>
>>> On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao <
>>> zhaowenbo.n...@gmail.com> wrote:
>>>
 Matt,

 Test 1 nonsmooth
 zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
 runkr_nonsmooth
 mpirun -n 1 ./step-41 \
-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 11:56 AM, Wenbo Zhao 
wrote:

>
>
> On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams  wrote:
>
>> Wenbo, do you build your PETSc?
>>
>> Yes.
> My configure option is listed below
> ./configure --with-mpi=1 --with-shared-libraries=1 \
> --with-64-bit-indices=1 --with-debugging=1
>
> And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.bashrc.
>
>
> The Makefile for my problem is listed below,
>
> PETSC_ARCH = arch-linux2-c-debug
> PETSC_DIR = /home/zhaowenbo/research/petsc/petsc_git
> SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc_git
> #PETSC_DIR = /home/zhaowenbo/research/petsc/petsc-3.7.4
> #SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc-3.7.3
> HYPRE_DIR = /usr/local/hypre
> #
> DEBUG_OPT  = -g
> COMP_FLAGS = -fPIC  -Wall  \
>   -I${SLEPC_DIR}/include -I${SLEPC_DIR}/${PETSC_ARCH}/include \
>   -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include \
>   -Isrc
>
> LINK_FLAGS = -fPIC -Wall  \
>   -Wl,-rpath,${SLEPC_DIR}/${PETSC_ARCH}/lib  -L${SLEPC_DIR}/${PETSC_ARCH}/lib
> -lslepc \
>   -Wl,-rpath,${PETSC_DIR}/${PETSC_ARCH}/lib  -L${PETSC_DIR}/${PETSC_ARCH}/lib
> -lpetsc \
>   -llapack -lblas -lhwloc -lm -lgfortran  -lquadmath
>
> step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
> mpicxx -o step-41 $^  ${LINK_FLAGS} ${DEBUG_OPT}
>
> src/main.o: src/main.c
> mpicxx -o src/main.o -c  $^  ${COMP_FLAGS} ${DEBUG_OPT}
>
> src/readinp.o: src/readinp.c
> mpicxx -o src/readinp.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>
> src/sp3.o: src/sp3.c
> mpicxx -o src/sp3.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>
> src/diffu.o: src/diffu.c
> mpicxx -o src/diffu.o -c $^  ${COMP_FLAGS} ${DEBUG_OPT}
>
> src/base.o: src/base.c
> mpicxx -o src/base.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}
>
>
> clean:
> rm step-41 src/main.o src/readinp.o src/sp3.o src/diffu.o src/base.o
>
> runkr_smooth:
> mpirun -n ${NCORE} ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-mata AMAT.dat -matb BMAT.dat \
>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>-st_gamg_est_ksp_converged_reason \
>

Add -st_gamg_est_ksp_error_if_not_converged 0

  Thanks,

 Matt
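For reference, the option has a programmatic analog on the relevant KSP (a sketch; whether GAMG's eigen-estimator sub-KSP is reachable this way from user code is an assumption):

```c
/* Programmatic analog of -..._ksp_error_if_not_converged 0: do not
 * raise a hard error when this sub-solve fails to converge. */
ierr = KSPSetErrorIfNotConverged(ksp, PETSC_FALSE);CHKERRQ(ierr);
```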

   -eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>
> runkr_nonsmooth:
> mpirun -n ${NCORE} ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-mata AMAT.dat -matb BMAT.dat \
>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>-st_gamg_est_ksp_converged_reason \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
>
> Thanks,
> Wenbo
>
>
>
>
>> On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams  wrote:
>>
>>> This is normal:
>>>
>>> Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS
>>> iterations 10
>>>
>>> It looks like ksp->errorifnotconverged got set somehow. If the default
>>> changed in KSP then (SAGG) GAMG would not ever work.
>>>
>>> I assume you don't have a .petscrc file with more (crazy) options in it
>>> ...
>>>
>>>
>>> On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao 
>>> wrote:
>>>


 On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley 
 wrote:

> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:
>
>> non-smoothed aggregation is converging very fast. smoothed fails in
>> the eigen estimator.
>>
>> Run this again with -st_gamg_est_ksp_view and
>> -st_gamg_est_ksp_monitor, and see if you get more output (I'm not 100% 
>> sure
>> about these args).
>>
>
> I also want -st_gamg_est_ksp_converged_reason
>
>   Thanks,
>
> Matt
>
 $make NCORE=1 runkr_smooth
 mpirun -n 1 ./step-41 \
-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
-mata AMAT.dat -matb BMAT.dat \
-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
-st_gamg_est_ksp_converged_reason \
-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
 makefile:43: recipe for target 'runkr_smooth' failed
 make: *** [runkr_smooth] Error 91

 Thanks
 Wenbo


>
>
>>
>> On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao > > wrote:
>>
>>> Matt,
>>>
>>> Test 1 nonsmooth
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
>>> runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-mata AMAT.dat -matb BMAT.dat \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth
>>> 2>&1
>>>
>>> Test 2 smooth
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
>>> runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -s

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams  wrote:

> Wenbo, do you build your PETSc?
>
> Yes.
My configure option is listed below
./configure --with-mpi=1 --with-shared-libraries=1 \
--with-64-bit-indices=1 --with-debugging=1

And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.bashrc.


The Makefile for my problem is listed below,

PETSC_ARCH = arch-linux2-c-debug
PETSC_DIR = /home/zhaowenbo/research/petsc/petsc_git
SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc_git
#PETSC_DIR = /home/zhaowenbo/research/petsc/petsc-3.7.4
#SLEPC_DIR = /home/zhaowenbo/research/slepc/slepc-3.7.3
HYPRE_DIR = /usr/local/hypre
#
DEBUG_OPT  = -g
COMP_FLAGS = -fPIC  -Wall  \
  -I${SLEPC_DIR}/include -I${SLEPC_DIR}/${PETSC_ARCH}/include \
  -I${PETSC_DIR}/include -I${PETSC_DIR}/${PETSC_ARCH}/include \
  -Isrc

LINK_FLAGS = -fPIC -Wall  \
  -Wl,-rpath,${SLEPC_DIR}/${PETSC_ARCH}/lib
-L${SLEPC_DIR}/${PETSC_ARCH}/lib -lslepc \
  -Wl,-rpath,${PETSC_DIR}/${PETSC_ARCH}/lib
-L${PETSC_DIR}/${PETSC_ARCH}/lib  -lpetsc \
  -llapack -lblas -lhwloc -lm -lgfortran  -lquadmath

step-41: src/main.o src/readinp.o src/base.o src/sp3.o src/diffu.o
	mpicxx -o step-41 $^ ${LINK_FLAGS} ${DEBUG_OPT}

src/main.o: src/main.c
	mpicxx -o src/main.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/readinp.o: src/readinp.c
	mpicxx -o src/readinp.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/sp3.o: src/sp3.c
	mpicxx -o src/sp3.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/diffu.o: src/diffu.c
	mpicxx -o src/diffu.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

src/base.o: src/base.c
	mpicxx -o src/base.o -c $^ ${COMP_FLAGS} ${DEBUG_OPT}

clean:
	rm step-41 src/main.o src/readinp.o src/sp3.o src/diffu.o src/base.o

runkr_smooth:
	mpirun -n ${NCORE} ./step-41 \
	   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
	   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
	   -mata AMAT.dat -matb BMAT.dat \
	   -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
	   -st_gamg_est_ksp_converged_reason \
	   -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_smooth 2>&1

runkr_nonsmooth:
	mpirun -n ${NCORE} ./step-41 \
	   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
	   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
	   -mata AMAT.dat -matb BMAT.dat \
	   -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
	   -st_gamg_est_ksp_converged_reason \
	   -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_nonsmooth 2>&1


Thanks,
Wenbo




> On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams  wrote:
>
>> This is normal:
>>
>> Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations
>> 10
>>
>> It looks like ksp->errorifnotconverged got set somehow. If the default
>> changed in KSP then (SAGG) GAMG would not ever work.
>>
>> I assume you don't have a .petscrc file with more (crazy) options in it
>> ...
>>
>>
>> On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao 
>> wrote:
>>
>>>
>>>
>>> On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley 
>>> wrote:
>>>
 On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:

> non-smoothed aggregation is converging very fast. smoothed fails in
> the eigen estimator.
>
> Run this again with -st_gamg_est_ksp_view and
> -st_gamg_est_ksp_monitor, and see if you get more output (I'm not 100% 
> sure
> about these args).
>

 I also want -st_gamg_est_ksp_converged_reason

   Thanks,

 Matt

>>> $make NCORE=1 runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-mata AMAT.dat -matb BMAT.dat \
>>>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>>>-st_gamg_est_ksp_converged_reason \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>> Thanks
>>> Wenbo
>>>
>>>


>
> On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao 
> wrote:
>
>> Matt,
>>
>> Test 1 nonsmooth
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
>> runkr_nonsmooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth
>> 2>&1
>>
>> Test 2 smooth
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>> makefile:43: recipe for target 'runkr_smooth' failed
>> make: *** [runkr_smooth] Error 91
>>
>>
>> Thanks,
>>
>

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
Wenbo, do you build your PETSc?

On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams  wrote:

> This is normal:
>
> Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations
> 10
>
> It looks like ksp->errorifnotconverged got set somehow. If the default
> changed in KSP then (SAGG) GAMG would not ever work.
>
> I assume you don't have a .petscrc file with more (crazy) options in it
> ...
>
>
> On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao 
> wrote:
>
>>
>>
>> On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley 
>> wrote:
>>
>>> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:
>>>
 non-smoothed aggregation is converging very fast. smoothed fails in the
 eigen estimator.

 Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
 and see if you get more output (I'm not 100% sure about these args).

>>>
>>> I also want -st_gamg_est_ksp_converged_reason
>>>
>>>   Thanks,
>>>
>>> Matt
>>>
>> $make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>>-st_gamg_est_ksp_converged_reason \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>> makefile:43: recipe for target 'runkr_smooth' failed
>> make: *** [runkr_smooth] Error 91
>>
>> Thanks
>> Wenbo
>>
>>
>>>
>>>

 On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao 
 wrote:

> Matt,
>
> Test 1 nonsmooth
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
> runkr_nonsmooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-mata AMAT.dat -matb BMAT.dat \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
> Test 2 smooth
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-mata AMAT.dat -matb BMAT.dat \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
>
> Thanks,
>
> Wenbo
>
> On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao > > wrote:
>>
>>> Mark,
>>>
>>> Thanks for your reply.
>>>
>>> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams  wrote:
>>>
 Please send the output with -st_ksp_view and -st_ksp_monitor and we
 can start to debug it.

 Test 1 with nonsmooth and preonly is OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
>>> runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth
>>> 2>&1
>>>
>>> Test 2 smooth and preonly is not OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
>>> runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>> Test 3 nonsmooth and gmres is not OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>>
>>
>> DO NOT DO THIS. Please send the output where you do NOTHING to the
>> coarse solver.
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>>> makefile:59: recipe for target 'runkr_gmres' failed
>>> make: *** [runkr_gmres] Error 91
>>>
>>> log-files is attached.
>>>
>>>
>>> You mentioned that B is not symmetric. I assume it is elliptic
 (diffusion). Where does t

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
This is normal:

Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations 10

It looks like ksp->errorifnotconverged got set somehow. If the default
changed in KSP then (SAGG) GAMG would not ever work.

I assume you don't have a .petscrc file with more (crazy) options in it ...


On Mon, Oct 2, 2017 at 11:39 AM, Wenbo Zhao 
wrote:

>
>
> On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:
>>
>>> non-smoothed aggregation is converging very fast. smoothed fails in the
>>> eigen estimator.
>>>
>>> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
>>> and see if you get more output (I'm not 100% sure about these args).
>>>
>>
>> I also want -st_gamg_est_ksp_converged_reason
>>
>>   Thanks,
>>
>> Matt
>>
> $make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-mata AMAT.dat -matb BMAT.dat \
>-st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
>-st_gamg_est_ksp_converged_reason \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
> Thanks
> Wenbo

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
On Mon, Oct 2, 2017 at 11:30 PM, Matthew Knepley  wrote:

> On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:
>
>> non-smoothed aggregation is converging very fast. smoothed fails in the
>> eigen estimator.
>>
>> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
>> and see if you get more output (I'm not 100% sure about these args).
>>
>
> I also want -st_gamg_est_ksp_converged_reason
>
>   Thanks,
>
> Matt
>
$make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
   -mata AMAT.dat -matb BMAT.dat \
   -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
   -st_gamg_est_ksp_converged_reason \
   -eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91

Thanks
Wenbo


>
>
>>
>> On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao 
>> wrote:
>>
>>> Matt,
>>>
>>> Test 1 nonsmooth
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-mata AMAT.dat -matb BMAT.dat \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>>
>>> Test 2 smooth
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-mata AMAT.dat -matb BMAT.dat \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>>
>>> Thanks,
>>>
>>> Wenbo
>>>
>>> On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley 
>>> wrote:
>>>
 On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao 
 wrote:

> Mark,
>
> Thanks for your reply.
>
> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams  wrote:
>
>> Please send the output with -st_ksp_view and -st_ksp_monitor and we
>> can start to debug it.
>>
>> Test 1 with nonsmooth and preonly is OK
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1
> runkr_nonsmooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
> Test 2 smooth and preonly is not OK
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
> Test 3 nonsmooth and gmres is not OK
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
> -st_mg_coarse_ksp_rtol 1.0e-6 \
>

 DO NOT DO THIS. Please send the output where you do NOTHING to the
 coarse solver.

   Thanks,

  Matt


>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
> makefile:59: recipe for target 'runkr_gmres' failed
> make: *** [runkr_gmres] Error 91
>
> Log files are attached.
>
>
> You mentioned that B is not symmetric. I assume it is elliptic
>> (diffusion). Where does the asymmetry come from?
>>
>>
> It is a set of two-group diffusion equations, where "group" denotes the
> neutron energy discretisation.
> Matrix B consists of the neutron diffusion/leakage term, the removal term,
> and minus the neutron scatter source term between different energies,
> while matrix A denotes the neutron fission source.
>
> The diffusion term (Laplace operator) is elliptic and symmetric. The
> removal term is diagonal only. However, the scatter term is asymmetric,
> since scattering from high energy to low energy is far greater than from
> low to high.
>
>
> Wenbo
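For reference, the block structure described above can be written out in a generic two-group form (a sketch: the symbols and coefficients are the standard textbook ones, not taken from Wenbo's code):

```latex
% Generalized eigenproblem: B\,\phi = \frac{1}{k_{\mathrm{eff}}}\,A\,\phi
B =
\begin{pmatrix}
 -\nabla\!\cdot D_1\nabla + \Sigma_{r,1} & -\Sigma_{s,2\to 1} \\
 -\Sigma_{s,1\to 2}                      & -\nabla\!\cdot D_2\nabla + \Sigma_{r,2}
\end{pmatrix},
\qquad
A =
\begin{pmatrix}
 \chi_1\,\nu\Sigma_{f,1} & \chi_1\,\nu\Sigma_{f,2} \\
 \chi_2\,\nu\Sigma_{f,1} & \chi_2\,\nu\Sigma_{f,2}
\end{pmatrix}
% B is asymmetric because downscatter dominates upscatter:
% \Sigma_{s,1\to 2} \gg \Sigma_{s,2\to 1}
```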

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams  wrote:

> non-smoothed aggregation is converging very fast. smoothed fails in the
> eigen estimator.
>
> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
> and see if you get more output (I'm not 100% sure about these args).
>

I also want -st_gamg_est_ksp_converged_reason

  Thanks,

Matt



Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
It seems to solve fine but then fails on this:

  ierr = PetscObjectSAWsBlock((PetscObject)ksp);CHKERRQ(ierr);
  if (ksp->errorifnotconverged && ksp->reason < 0)
SETERRQ(comm,PETSC_ERR_NOT_CONVERGED,"KSPSolve has not converged");

It looks like somehow ksp->errorifnotconverged got set.




Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 11:21 AM, Mark Adams  wrote:

> Yea, it fails in the eigen estimator, but the Cheby eigen estimator works
> in the solve that works:
>
> eigenvalue estimates used:  min = 0.14, max = 1.10004
> eigenvalues estimate via gmres min 0.0118548, max 1.4
>
> Why would it just give "KSPSolve has not converged". It is not supposed to
> converge ...
>

This sounds like a mistake with KSPSetErrorIfNotConverged
(http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPSetErrorIfNotConverged.html)
somewhere.

   Matt



Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
I get more output
zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
   -mata AMAT.dat -matb BMAT.dat \
   -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
   -eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91

zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
mpirun -n 1 ./step-41 \
   -st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
   -mata AMAT.dat -matb BMAT.dat \
   -st_gamg_est_ksp_view -st_gamg_est_ksp_monitor \
   -eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1

On Mon, Oct 2, 2017 at 11:15 PM, Mark Adams  wrote:

> non-smoothed aggregation is converging very fast. smoothed fails in the
> eigen estimator.
>
> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
> and see if you get more output (I'm not 100% sure about these args).



Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
Yea, it fails in the eigen estimator, but the Cheby eigen estimator works
in the solve that works:

eigenvalue estimates used:  min = 0.14, max = 1.10004
eigenvalues estimate via gmres min 0.0118548, max 1.4

Why would it just give "KSPSolve has not converged"? It is not supposed to
converge ...



On Mon, Oct 2, 2017 at 11:11 AM, Matthew Knepley  wrote:

> On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao 
> wrote:
>
>> Matt,
>>
>
> Thanks Wenbo.
>
>
>> Test 1 nonsmooth
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>
>> Test 2 smooth
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-mata AMAT.dat -matb BMAT.dat \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>> makefile:43: recipe for target 'runkr_smooth' failed
>> make: *** [runkr_smooth] Error 91
>>
>>
> Mark, the solve is not failing, it's the construction of the interpolator, I
> think. Check out this stack:
>
> [0]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR:
> [0]PETSC ERROR: KSPSolve has not converged
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.8.0, unknown
> [0]PETSC ERROR: ./step-41 on a arch-linux2-c-debug named ubuntu by
> zhaowenbo Mon Oct  2 08:00:58 2017
> [0]PETSC ERROR: Configure options --with-mpi=1 --with-shared-libraries=1
> --with-64-bit-indices=1 --with-debugging=1
> [0]PETSC ERROR: #1 KSPSolve() line 855 in /home/zhaowenbo/research/
> petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #2 PCGAMGOptProlongator_AGG() line 1186 in
> /home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/agg.c
> [0]PETSC ERROR: #3 PCSetUp_GAMG() line 528 in /home/zhaowenbo/research/
> petsc/petsc_git/src/ksp/pc/impls/gamg/gamg.c
> [0]PETSC ERROR: #4 PCSetUp() line 924 in /home/zhaowenbo/research/
> petsc/petsc_git/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #5 KSPSetUp() line 378 in /home/zhaowenbo/research/
> petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #6 STSetUp_Shift() line 129 in /home/zhaowenbo/research/
> slepc/slepc_git/src/sys/classes/st/impls/shift/shift.c
> [0]PETSC ERROR: #7 STSetUp() line 281 in /home/zhaowenbo/research/
> slepc/slepc_git/src/sys/classes/st/interface/stsolve.c
> [0]PETSC ERROR: #8 EPSSetUp() line 273 in /home/zhaowenbo/research/
> slepc/slepc_git/src/eps/interface/epssetup.c
> [0]PETSC ERROR: #9 solve_diffusion_3d() line 1029 in src/diffu.c
> [0]PETSC ERROR: #10 main() line 25 in src/main.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -eps_monitor
> [0]PETSC ERROR: -eps_ncv 10
> [0]PETSC ERROR: -eps_nev 1
> [0]PETSC ERROR: -log_view
> [0]PETSC ERROR: -mata AMAT.dat
> [0]PETSC ERROR: -matb BMAT.dat
> [0]PETSC ERROR: -st_ksp_monitor
> [0]PETSC ERROR: -st_ksp_type gmres
> [0]PETSC ERROR: -st_ksp_view
> [0]PETSC ERROR: -st_pc_gamg_agg_nsmooths 1
> [0]PETSC ERROR: -st_pc_gamg_type agg
> [0]PETSC ERROR: -st_pc_type gamg
> [0]PETSC ERROR: End of Error Message ---send entire
> error message to petsc-ma...@mcs.anl.gov--
> --
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 91.
>
>   Thanks,
>
>  Matt

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
Non-smoothed aggregation is converging very fast; smoothed fails in the
eigen estimator.

Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor, and
see if you get more output (I'm not 100% sure about these args).



On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao 
wrote:

> Matt,
>
> Test 1 nonsmooth
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-mata AMAT.dat -matb BMAT.dat \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
> Test 2 smooth
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-mata AMAT.dat -matb BMAT.dat \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
>
> Thanks,
>
> Wenbo
>
> On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao 
>> wrote:
>>
>>> Mark,
>>>
>>> Thanks for your reply.
>>>
>>> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams  wrote:
>>>
 Please send the output with -st_ksp_view and -st_ksp_monitor and we can
 start to debug it.

 Test 1 with nonsmooth and preonly is OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>>
>>> Test 2 smooth and preonly is not OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>> Test 3 nonsmooth and gmres is not OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>>
>>
>> DO NOT DO THIS. Please send the output where you do NOTHING to the coarse
>> solver.
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>>> makefile:59: recipe for target 'runkr_gmres' failed
>>> make: *** [runkr_gmres] Error 91
>>>
>>> Log files are attached.
>>>
>>>
>>> You mentioned that B is not symmetric. I assume it is elliptic
 (diffusion). Where does the asymmetry come from?


>>> It is a two-group diffusion equation, where "group" denotes the neutron
>>> energy discretisation.
>>> Matrix B consists of the neutron diffusion/leakage term, the removal term,
>>> and minus the neutron scattering source term between different energies,
>>> while matrix A denotes the neutron fission source.
>>>
>>> The diffusion term (Laplace operator) is elliptic and symmetric. The
>>> removal term is purely diagonal. However, the scattering term is asymmetric,
>>> since scattering from high energy to low energy is far greater than from
>>> low to high.
>>>
>>>
>>> Wenbo
>>>
>>>
 On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao 
 wrote:

> Matt,
> Thanks for your reply.
> Because the default options did not work at first (-st_ksp_type gmres
> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried
> to test those options.
>
> Wenbo
>
> On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao 
>> wrote:
>>
>>> Matt
>>>
>>> Because I am not clear about what will happen when using 'preonly' for a
>>> large-scale problem.
>>>
>>
>> The size of the problem has nothing to do with 'preonly'. All it
>> means is to apply a preconditioner without a Krylov solver.
>>
>>
>>> It seems to use a direct solver from below,
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>>> KSP/KSPPREONLY.html
>>>
>>
>> However, I still cannot understand why you would change the default?
>>
>>   Matt
>>
>>
>>>
>>> Thanks!

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao 
wrote:

> Matt,
>

Thanks Wenbo.


> Test 1 nonsmooth
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-mata AMAT.dat -matb BMAT.dat \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
> Test 2 smooth
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-mata AMAT.dat -matb BMAT.dat \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
>
Mark, the solve is not failing; it's the construction of the interpolator, I
think. Check out this stack:

[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR:
[0]PETSC ERROR: KSPSolve has not converged
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.0, unknown
[0]PETSC ERROR: ./step-41 on a arch-linux2-c-debug named ubuntu by
zhaowenbo Mon Oct  2 08:00:58 2017
[0]PETSC ERROR: Configure options --with-mpi=1 --with-shared-libraries=1
--with-64-bit-indices=1 --with-debugging=1
[0]PETSC ERROR: #1 KSPSolve() line 855 in
/home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #2 PCGAMGOptProlongator_AGG() line 1186 in
/home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/agg.c
[0]PETSC ERROR: #3 PCSetUp_GAMG() line 528 in
/home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/impls/gamg/gamg.c
[0]PETSC ERROR: #4 PCSetUp() line 924 in
/home/zhaowenbo/research/petsc/petsc_git/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #5 KSPSetUp() line 378 in
/home/zhaowenbo/research/petsc/petsc_git/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #6 STSetUp_Shift() line 129 in
/home/zhaowenbo/research/slepc/slepc_git/src/sys/classes/st/impls/shift/shift.c
[0]PETSC ERROR: #7 STSetUp() line 281 in
/home/zhaowenbo/research/slepc/slepc_git/src/sys/classes/st/interface/stsolve.c
[0]PETSC ERROR: #8 EPSSetUp() line 273 in
/home/zhaowenbo/research/slepc/slepc_git/src/eps/interface/epssetup.c
[0]PETSC ERROR: #9 solve_diffusion_3d() line 1029 in src/diffu.c
[0]PETSC ERROR: #10 main() line 25 in src/main.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -eps_monitor
[0]PETSC ERROR: -eps_ncv 10
[0]PETSC ERROR: -eps_nev 1
[0]PETSC ERROR: -log_view
[0]PETSC ERROR: -mata AMAT.dat
[0]PETSC ERROR: -matb BMAT.dat
[0]PETSC ERROR: -st_ksp_monitor
[0]PETSC ERROR: -st_ksp_type gmres
[0]PETSC ERROR: -st_ksp_view
[0]PETSC ERROR: -st_pc_gamg_agg_nsmooths 1
[0]PETSC ERROR: -st_pc_gamg_type agg
[0]PETSC ERROR: -st_pc_type gamg
[0]PETSC ERROR: End of Error Message ---send entire
error message to petsc-ma...@mcs.anl.gov--
--
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 91.

  Thanks,

 Matt


> Thanks,
>
> Wenbo
>
> On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley 
> wrote:
>
>> On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao 
>> wrote:
>>
>>> Mark,
>>>
>>> Thanks for your reply.
>>>
>>> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams  wrote:
>>>
 Please send the output with -st_ksp_view and -st_ksp_monitor and we can
 start to debug it.

 Test 1 with nonsmooth and preonly is OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>>
>>> Test 2 smooth and preonly is not OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>> Test 3 nonsmooth and gmres is not OK
>>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>  

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
Matt,

Test 1 nonsmooth
zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
mpirun -n 1 ./step-41 \
   -st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
   -mata AMAT.dat -matb BMAT.dat \
   -eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1

Test 2 smooth
zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
   -mata AMAT.dat -matb BMAT.dat \
   -eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
makefile:43: recipe for target 'runkr_smooth' failed
make: *** [runkr_smooth] Error 91


Thanks,

Wenbo

On Mon, Oct 2, 2017 at 10:48 PM, Matthew Knepley  wrote:

> On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao 
> wrote:
>
>> Mark,
>>
>> Thanks for your reply.
>>
>> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams  wrote:
>>
>>> Please send the output with -st_ksp_view and -st_ksp_monitor and we can
>>> start to debug it.
>>>
>>> Test 1 with nonsmooth and preonly is OK
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>
>> Test 2 smooth and preonly is not OK
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>> makefile:43: recipe for target 'runkr_smooth' failed
>> make: *** [runkr_smooth] Error 91
>>
>> Test 3 nonsmooth and gmres is not OK
>> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>
>
> DO NOT DO THIS. Please send the output where you do NOTHING to the coarse
> solver.
>
>   Thanks,
>
>  Matt
>
>
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>> makefile:59: recipe for target 'runkr_gmres' failed
>> make: *** [runkr_gmres] Error 91
>>
>> Log files are attached.
>>
>>
>> You mentioned that B is not symmetric. I assume it is elliptic
>>> (diffusion). Where does the asymmetry come from?
>>>
>>>
>> It is a two-group diffusion equation, where "group" denotes the neutron
>> energy discretisation.
>> Matrix B consists of the neutron diffusion/leakage term, the removal term,
>> and minus the neutron scattering source term between different energies,
>> while matrix A denotes the neutron fission source.
>>
>> The diffusion term (Laplace operator) is elliptic and symmetric. The removal
>> term is purely diagonal. However, the scattering term is asymmetric, since
>> scattering from high energy to low energy is far greater than from low to high.
>>
>>
>> Wenbo
>>
>>
>>> On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao 
>>> wrote:
>>>
 Matt,
 Thanks for your reply.
 Because the default options did not work at first (-st_ksp_type gmres
 -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried
 to test those options.

 Wenbo

 On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley 
 wrote:

> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao 
> wrote:
>
>> Matt
>>
>> Because I am not clear about what will happen when using 'preonly' for a
>> large-scale problem.
>>
>
> The size of the problem has nothing to do with 'preonly'. All it means
> is to apply a preconditioner without a Krylov solver.
>
>
>> It seems to use a direct solver from below,
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>> KSP/KSPPREONLY.html
>>
>
> However, I still cannot understand why you would change the default?
>
>   Matt
>
>
>>
>> Thanks!
>> Wenbo
>>
>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley 
>> wrote:
>>
>>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao >> > wrote:
>>>
 Matt,
 Thanks for your reply.
 It DOES make no sense for this problem.
 But I am not clear about the 'preonly' option. Which solver is used
 in preonly? I wonder if 'preonly' is suitable for a large-scale problem
 such as

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao 
wrote:

> Mark,
>
> Thanks for your reply.
>
> On Mon, Oct 2, 2017 at 9:51 PM, Mark Adams  wrote:
>
>> Please send the output with -st_ksp_view and -st_ksp_monitor and we can
>> start to debug it.
>>
>> Test 1 with nonsmooth and preonly is OK
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
> Test 2 smooth and preonly is not OK
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
> Test 3 nonsmooth and gmres is not OK
> zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_gmres
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  -st_ksp_view -st_ksp_monitor  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
> -st_mg_coarse_ksp_rtol 1.0e-6 \
>

DO NOT DO THIS. Please send the output where you do NOTHING to the coarse
solver.

  Thanks,

 Matt


>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
> makefile:59: recipe for target 'runkr_gmres' failed
> make: *** [runkr_gmres] Error 91
>
> Log files are attached.
>
>
> You mentioned that B is not symmetric. I assume it is elliptic
>> (diffusion). Where does the asymmetry come from?
>>
>>
> It is a two-group diffusion equation, where "group" denotes the neutron
> energy discretisation.
> Matrix B consists of the neutron diffusion/leakage term, the removal term,
> and minus the neutron scattering source term between different energies,
> while matrix A denotes the neutron fission source.
>
> The diffusion term (Laplace operator) is elliptic and symmetric. The removal
> term is purely diagonal. However, the scattering term is asymmetric, since
> scattering from high energy to low energy is far greater than from low to high.
>
>
> Wenbo
>
>
>> On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao 
>> wrote:
>>
>>> Matt,
>>> Thanks for your reply.
>>> Because the default options did not work at first (-st_ksp_type gmres
>>> -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried
>>> to test those options.
>>>
>>> Wenbo
>>>
>>> On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley 
>>> wrote:
>>>
 On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao 
 wrote:

> Matt
>
> Because I am not clear about what will happen when using 'preonly' for a
> large-scale problem.
>

 The size of the problem has nothing to do with 'preonly'. All it means
 is to apply a preconditioner without a Krylov solver.


> It seems to use a direct solver from below,
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
> KSP/KSPPREONLY.html
>

 However, I still cannot understand why you would change the default?

   Matt


>
> Thanks!
> Wenbo
>
> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley 
> wrote:
>
>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao 
>> wrote:
>>
>>> Matt,
>>> Thanks for your reply.
>>> It DOES make no sense for this problem.
>>> But I am not clear about the 'preonly' option. Which solver is used
>>> in preonly? I wonder if 'preonly' is suitable for a large-scale problem
>>> such as 400,000,000 unknowns.
>>> So I tried 'gmres' option and found these error messages.
>>>
>>
>> I mean, why are you setting this at all? Just do not set the coarse
>> solver. The default should work fine.
>>
>>   Thanks,
>>
>> Matt
>>
>>
>>> Could you give me some suggestions?
>>>
>>> Thanks.
>>>
>>> Wenbo
>>>
>>>
>>> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
>>> wrote:
>>>
 On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao <
 zhaowenbo.n...@gmail.com> wrote:

> Hi,
>
> I ran into some problems using PETSc/SLEPc to solve the two-group
> neutron diffusion equations with the finite difference method. The grid is
> 3*3*3, with 2 DOF on each point, so the matrix size is 54*54.
> It is a generalized eigenvalue problem Ax = \lambda Bx, where B is
> a diagonally dominant matrix but not symmetric.
> EPS is set as bel

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
Please send the output with -st_ksp_view and -st_ksp_monitor and we can
start to debug it.

You mentioned that B is not symmetric. I assume it is elliptic (diffusion).
Where does the asymmetry come from?

On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao  wrote:

> Matt,
> Thanks for your reply.
> Because the default options did not work at first (-st_ksp_type gmres -st_pc_type
> gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried to test
> those options.
>
> Wenbo
>
> On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley  wrote:
>
>> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao 
>> wrote:
>>
>>> Matt
>>>
>>> Because I am not clear about what will happen when using 'preonly' for a
>>> large-scale problem.
>>>
>>
>> The size of the problem has nothing to do with 'preonly'. All it means is
>> to apply a preconditioner without a Krylov solver.
>>
>>
>>> It seems to use a direct solver from below,
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>>> KSP/KSPPREONLY.html
>>>
>>
>> However, I still cannot understand why you would change the default?
>>
>>   Matt
>>
>>
>>>
>>> Thanks!
>>> Wenbo
>>>
>>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley 
>>> wrote:
>>>
 On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao 
 wrote:

> Matt,
> Thanks for your reply.
> It DOES make no sense for this problem.
> But I am not clear about the 'preonly' option. Which solver is used in
> preonly? I wonder if 'preonly' is suitable for a large-scale problem such as
> 400,000,000 unknowns.
> So I tried 'gmres' option and found these error messages.
>

 I mean, why are you setting this at all? Just do not set the coarse
 solver. The default should work fine.

   Thanks,

 Matt


> Could you give me some suggestions?
>
> Thanks.
>
> Wenbo
>
>
> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
> wrote:
>
>> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao 
>> wrote:
>>
>>> Hi,
>>>
>>> I ran into some problems using PETSc/SLEPc to solve the two-group
>>> neutron diffusion equations with the finite difference method. The grid is
>>> 3*3*3, with 2 DOF on each point, so the matrix size is 54*54.
>>> It is a generalized eigenvalue problem Ax = \lambda Bx, where B is
>>> a diagonally dominant matrix but not symmetric.
>>> EPS is set as below:
>>>  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);
>>>  ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);
>>>
>>> Krylov-Schur is used as the EPS solver. GAMG is used as the PC.
>>> I tried agg_nsmooths and mg_coarse_ksp_type. Only nsmooths 0 together with
>>> preonly is OK.
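For context, a minimal self-contained sketch of the SLEPc setup Wenbo describes (generalized non-Hermitian problem, largest real eigenvalue, Krylov-Schur, GMRES+GAMG on the spectral transform's KSP) could look like the following. This is an illustration under stated assumptions, not Wenbo's actual step-41 code; it assumes AMAT.dat and BMAT.dat are PETSc binary matrix files, and it requires a PETSc/SLEPc installation to build:

```c
#include <slepceps.h>

int main(int argc, char **argv)
{
  Mat            A, B;   /* A: fission source; B: diffusion/removal/scatter */
  EPS            eps;
  ST             st;
  KSP            ksp;
  PC             pc;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = SlepcInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Load the two operators from the binary files mentioned in the thread */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "AMAT.dat", FILE_MODE_READ, &viewer); CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
  ierr = MatLoad(A, viewer); CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer); CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "BMAT.dat", FILE_MODE_READ, &viewer); CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &B); CHKERRQ(ierr);
  ierr = MatLoad(B, viewer); CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer); CHKERRQ(ierr);

  /* Generalized non-Hermitian problem A x = lambda B x, largest real part */
  ierr = EPSCreate(PETSC_COMM_WORLD, &eps); CHKERRQ(ierr);
  ierr = EPSSetOperators(eps, A, B); CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps, EPS_GNHEP); CHKERRQ(ierr);
  ierr = EPSSetWhichEigenpairs(eps, EPS_LARGEST_REAL); CHKERRQ(ierr);
  ierr = EPSSetType(eps, EPSKRYLOVSCHUR); CHKERRQ(ierr);

  /* Inner solver of the spectral transform: GMRES preconditioned by GAMG */
  ierr = EPSGetST(eps, &st); CHKERRQ(ierr);
  ierr = STGetKSP(st, &ksp); CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPGMRES); CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCGAMG); CHKERRQ(ierr);

  ierr = EPSSetFromOptions(eps); CHKERRQ(ierr);  /* picks up -st_* / -eps_* flags */
  ierr = EPSSolve(eps); CHKERRQ(ierr);

  ierr = EPSDestroy(&eps); CHKERRQ(ierr);
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = MatDestroy(&B); CHKERRQ(ierr);
  ierr = SlepcFinalize();
  return ierr;
}
```

With EPSSetFromOptions() in place, all of the -st_pc_gamg_* and -st_mg_coarse_* variations in the tests above can be supplied on the command line without touching the code.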
>>>
>>
>> Why are you setting the coarse solver? This makes no sense.
>>
>>Thanks,
>>
>> Matt
>>
>>
>>>
>>> Test 1
>>> $ make NCORE=1 runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth
>>> 2>&1
>>>
>>> Test 2
>>> $ make NCORE=1 runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>> Test 3
>>> $ make NCORE=1 runkr_gmres
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>>> makefile:59: recipe for target 'runkr_gmres' failed
>>> make: *** [runkr_gmres] Error 91
>>>
>>> Log files were attached.
>>> The matrix files were also attached as AMAT.dat and BMAT.dat.
>>>
>>> Is this correct? Or is something wrong with my code or command line?
>>>
>>> Thanks!
>>>
>>> Wenbo
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which 
>> their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>
>
>


 --
 What most experimenters take for granted before they begin the

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
Matt,
Thanks for your reply.
Because the default options did not work at first (-st_ksp_type gmres -st_pc_type
gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried to test
those options.

Wenbo

On Mon, Oct 2, 2017 at 9:08 PM, Matthew Knepley  wrote:

> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao 
> wrote:
>
>> Matt
>>
>> Because I am not clear about what will happen when using 'preonly' for a
>> large-scale problem.
>>
>
> The size of the problem has nothing to do with 'preonly'. All it means is
> to apply a preconditioner without a Krylov solver.
>
>
>> It seems to use a direct solver from below,
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>> KSP/KSPPREONLY.html
>>
>
> However, I still cannot understand why you would change the default?
>
>   Matt
>
>
>>
>> Thanks!
>> Wenbo
>>
>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley 
>> wrote:
>>
>>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao 
>>> wrote:
>>>
 Matt,
 Thanks for your reply.
 It DOES make no sense for this problem.
 But I am not clear about the 'preonly' option. Which solver is used in
 preonly? I wonder if 'preonly' is suitable for a large-scale problem such as
 400,000,000 unknowns.
 So I tried 'gmres' option and found these error messages.

>>>
>>> I mean, why are you setting this at all? Just do not set the coarse
>>> solver. The default should work fine.
>>>
>>>   Thanks,
>>>
>>> Matt
>>>
>>>
 Could you give me some suggestions?

 Thanks.

 Wenbo


 On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
 wrote:

> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao 
> wrote:
>
>> Hi,
>>
>> I ran into some problems using PETSc/SLEPc to solve the two-group
>> neutron diffusion equations with the finite difference method. The grid is
>> 3*3*3, with 2 DOF on each point, so the matrix size is 54*54.
>> It is a generalized eigenvalue problem Ax = \lambda Bx, where B is
>> a diagonally dominant matrix but not symmetric.
>> EPS is set as below:
>>  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);
>>  ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);
>>
>> Krylov-Schur is used as the EPS solver. GAMG is used as the PC.
>> I tried agg_nsmooths and mg_coarse_ksp_type. Only nsmooths 0 together with
>> preonly is OK.
>>
>
> Why are you setting the coarse solver? This makes no sense.
>
>Thanks,
>
> Matt
>
>
>>
>> Test 1
>> $ make NCORE=1 runkr_nonsmooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth
>> 2>&1
>>
>> Test 2
>> $ make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>> makefile:43: recipe for target 'runkr_smooth' failed
>> make: *** [runkr_smooth] Error 91
>>
>> Test 3
>> $ make NCORE=1 runkr_gmres
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>> makefile:59: recipe for target 'runkr_gmres' failed
>> make: *** [runkr_gmres] Error 91
>>
>> Log files were attached.
>> The matrix files were also attached as AMAT.dat and BMAT.dat.
>>
>> Is this correct? Or is something wrong with my code or command line?
>>
>> Thanks!
>>
>> Wenbo
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> 
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://ww

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Mark Adams
GAMG will coarsen the problem until it is small and fast to solve with a
direct solver (LU). You can use preonly if you have a perfect
preconditioner.
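In option terms, trusting GAMG's default coarse solver simply means omitting every -st_mg_coarse_* flag. A sketch of such a run, mirroring Test 1 from the thread but with no coarse-level overrides at all:

```shell
# Leave the coarse-level solver at its default (a direct solve on the small
# coarse problem, as Mark describes); pass no -st_mg_coarse_* options.
mpirun -n 1 ./step-41 \
   -st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
   -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
   -mata AMAT.dat -matb BMAT.dat \
   -eps_nev 1 -eps_ncv 10 -eps_monitor -log_view > log_default 2>&1
```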

On Mon, Oct 2, 2017 at 9:08 AM, Matthew Knepley  wrote:

> On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao 
> wrote:
>
>> Matt
>>
>> Because I am not clear about what will happen when using 'preonly' for a
>> large-scale problem.
>>
>
> The size of the problem has nothing to do with 'preonly'. All it means is
> to apply a preconditioner without a Krylov solver.
>
>
>> It seems to use a direct solver from below,
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>> KSP/KSPPREONLY.html
>>
>
> However, I still cannot understand why you would change the default?
>
>   Matt
>
>
>>
>> Thanks!
>> Wenbo
>>
>> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley 
>> wrote:
>>
>>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao 
>>> wrote:
>>>
 Matt,
 Thanks for your reply.
 It DOES make no sense for this problem.
 But I am not clear about the 'preonly' option. Which solver is used in
 preonly? I wonder if 'preonly' is suitable for a large-scale problem such as
 400,000,000 unknowns.
 So I tried 'gmres' option and found these error messages.

>>>
>>> I mean, why are you setting this at all? Just do not set the coarse
>>> solver. The default should work fine.
>>>
>>>   Thanks,
>>>
>>> Matt
>>>
>>>
 Could you give me some suggestions?

 Thanks.

 Wenbo


 On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
 wrote:

> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao 
> wrote:
>
>> Hi,
>>
>> I ran into some problems using PETSc/SLEPc to solve the two-group
>> neutron diffusion equations with the finite difference method. The grid is
>> 3*3*3, with 2 DOF on each point, so the matrix size is 54*54.
>> It is a generalized eigenvalue problem Ax = \lambda Bx, where B is
>> a diagonally dominant matrix but not symmetric.
>> EPS is set as below:
>>  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);
>>  ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);
>>
>> Krylov-Schur is used as the EPS solver. GAMG is used as the PC.
>> I tried agg_nsmooths and mg_coarse_ksp_type. Only nsmooths 0 together with
>> preonly is OK.
>>
>
> Why are you setting the coarse solver? This makes no sense.
>
>Thanks,
>
> Matt
>
>
>>
>> Test 1
>> $ make NCORE=1 runkr_nonsmooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth
>> 2>&1
>>
>> Test 2
>> $ make NCORE=1 runkr_smooth
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>> makefile:43: recipe for target 'runkr_smooth' failed
>> make: *** [runkr_smooth] Error 91
>>
>> Test 3
>> $ make NCORE=1 runkr_gmres
>> mpirun -n 1 ./step-41 \
>>-st_ksp_type gmres  \
>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>> makefile:59: recipe for target 'runkr_gmres' failed
>> make: *** [runkr_gmres] Error 91
>>
>> Log files were attached.
>> The matrix files were also attached as AMAT.dat and BMAT.dat.
>>
>> Is this correct? Or is something wrong with my code or command line?
>>
>> Thanks!
>>
>> Wenbo
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/ 
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ 

Re: [petsc-users] Load distributed matrices from directory

2017-10-02 Thread Barry Smith

  MPCs?

  If you have a collection of "overlapping matrices" on disk, then you will be 
responsible for providing the matrix-vector product for the operator yourself, 
which you absolutely need if you are going to use any Krylov-based overlapping 
Schwarz method. How do you plan to perform the matrix-vector products?

   With regard to applying the preconditioner, that is more straightforward. 
Each process gets its "overlapped" matrix, and all you need to do is provide the 
VecScatter from a global vector to the "overlapped" vector, do the local 
MatMult() with your "overlapped" matrix, and then scatter-add back.
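A rough sketch of the scatter/apply/scatter-add pattern Barry describes, wired up as a PCShell apply callback. All names here (ShellCtx, ApplyOverlap) are illustrative, and the local operation is shown as a plain MatMult() with the per-rank overlapped matrix, exactly as in Barry's description; a real preconditioner would typically do a local solve instead:

```c
#include <petscksp.h>

/* Context carrying each rank's overlapped pieces (hypothetical names). */
typedef struct {
  Mat        Aloc;         /* this rank's overlapped matrix (sequential)      */
  Vec        xloc, yloc;   /* sequential work vectors of the overlapped size  */
  VecScatter scat;         /* global vector <-> overlapped local vector       */
} ShellCtx;

/* y = scatter-add over ranks of Aloc * (restriction of x to this subdomain) */
static PetscErrorCode ApplyOverlap(PC pc, Vec x, Vec y)
{
  ShellCtx      *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PCShellGetContext(pc, (void **)&ctx); CHKERRQ(ierr);
  /* Restrict the global vector to this rank's overlapped subdomain */
  ierr = VecScatterBegin(ctx->scat, x, ctx->xloc, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx->scat, x, ctx->xloc, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
  /* Apply the local overlapped operator (replace with a local solve if desired) */
  ierr = MatMult(ctx->Aloc, ctx->xloc, ctx->yloc); CHKERRQ(ierr);
  /* Scatter-add the local result back into the global vector */
  ierr = VecZeroEntries(y); CHKERRQ(ierr);
  ierr = VecScatterBegin(ctx->scat, ctx->yloc, y, ADD_VALUES, SCATTER_REVERSE); CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx->scat, ctx->yloc, y, ADD_VALUES, SCATTER_REVERSE); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Such a callback would be installed with PCSetType(pc, PCSHELL), PCShellSetContext(pc, &ctx), and PCShellSetApply(pc, ApplyOverlap), with the context's matrix and scatter built from the on-disk pieces.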

  I question whether this entire process is even worth your time. Note: I am not a 
fan of "custom application code" for a "custom" domain decomposition method 
(obviously, or I would never have designed PETSc ;). I believe in a general-purpose 
library that you can then "customize" for your unique problem once you have 
determined by profiling that the customization/optimization is even worth it. For 
example, I would just start with PCASM; then, if you determine that it is not 
selecting good subdomains, you can add customization to provide more information 
about how you want the subdomains to be defined, etc. Some possibly useful 
routines for customization:

PETSC_EXTERN PetscErrorCode PCASMSetLocalSubdomains(PC,PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMSetTotalSubdomains(PC,PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMSetOverlap(PC,PetscInt);
PETSC_EXTERN PetscErrorCode PCASMSetDMSubdomains(PC,PetscBool);
PETSC_EXTERN PetscErrorCode PCASMGetDMSubdomains(PC,PetscBool*);
PETSC_EXTERN PetscErrorCode PCASMSetSortIndices(PC,PetscBool);

PETSC_EXTERN PetscErrorCode PCASMSetType(PC,PCASMType);
PETSC_EXTERN PetscErrorCode PCASMGetType(PC,PCASMType*);
PETSC_EXTERN PetscErrorCode PCASMSetLocalType(PC,PCCompositeType);
PETSC_EXTERN PetscErrorCode PCASMGetLocalType(PC,PCCompositeType*);
PETSC_EXTERN PetscErrorCode PCASMCreateSubdomains(Mat,PetscInt,IS*[]);
PETSC_EXTERN PetscErrorCode PCASMDestroySubdomains(PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMCreateSubdomains2D(PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt*,IS**,IS**);
PETSC_EXTERN PetscErrorCode PCASMGetLocalSubdomains(PC,PetscInt*,IS*[],IS*[]);
PETSC_EXTERN PetscErrorCode PCASMGetLocalSubmatrices(PC,PetscInt*,Mat*[]);
PETSC_EXTERN PetscErrorCode PCASMGetSubMatType(PC,MatType*);
PETSC_EXTERN PetscErrorCode PCASMSetSubMatType(PC,MatType);

If these are not useful, you could tell us what kind of customization you want 
within KSP/PCASM, and depending on how generally useful it might be, we could 
possibly add more hooks for you.

  Barry


> On Oct 2, 2017, at 10:12 AM, Matthieu Vitse  wrote:
> 
> 
>> Le 29 sept. 2017 à 17:43, Barry Smith  a écrit :
>> 
>>  Or is your matrix generator code sequential and cannot generate the full 
>> matrix so you want to generate chunks at a time and save to disk then load 
>> them? Better for you to refactor your code to work in parallel in generating 
>> the whole thing (since you can already generate parts the refactoring 
>> shouldn't be terribly difficult).
> 
> Thanks for your answer. 
> 
> The matrix is already generated in parallel, but we want to keep control on 
> the decomposition which conflicts with directly using PCASM. That’s why we 
> would really like to work only with the distributed matrices. Are there some 
> issues that would prevent me from doing that ? Moreover, ASM is a first step, 
> we would like then to use those matrices for multi-preconditioning our 
> problem, and take into account MPCs (as a consequence we really need to know 
> the decomposition). 
> 
> Thanks, 
> 
> — 
> Matt



Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao  wrote:

> Matt
>
> Because I am not clear about what will happen using 'preonly' for large
> scale problem.
>

The size of the problem has nothing to do with 'preonly'. All it means is
to apply a preconditioner without a Krylov solver.


> It seems to use a direct solver from below,
> http://www.mcs.anl.gov/petsc/petsc-current/docs/
> manualpages/KSP/KSPPREONLY.html
>

However, I still cannot understand why you would change the default.

  Matt


>
> Thanks!
> Wenbo
>
> On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley  wrote:
>
>> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao 
>> wrote:
>>
>>> Matt,
>>> Thanks for your reply.
>>> It DOES make no sense for this problem.
>>> But I am not clear about the 'preonly' option. Which solver is used in
>>> preonly? I wonder if 'preonly' is suitable for large scale problem such as
>>> 400,000,000 unknowns.
>>> So I tried 'gmres' option and found these error messages.
>>>
>>
>> I mean, why are you setting this at all. Just do not set the coarse
>> solver. The default should work fine.
>>
>>   Thanks,
>>
>> Matt
>>
>>
>>> Could you give me some suggestions?
>>>
>>> Thanks.
>>>
>>> Wenbo
>>>
>>>
>>> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
>>> wrote:
>>>
 On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao 
 wrote:

> Hi,
>
> I ran into some questions when using PETSc/SLEPc to solve the two-group
> neutron diffusion equations with the finite difference method. The grid is
> 3*3*3, and the DOF at each point is 2, so the matrix size is 54*54.
> It is a generalized eigenvalue problem Ax = \lambda Bx, where B is a
> diagonally dominant matrix but not symmetric.
> EPS is set as below,
>  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);¬
>  ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);¬
>
> Krylov-Schur is used as the EPS solver. GAMG is used as the PC.
> I tried agg_nsmooths and mg_coarse_ksp_type. Only nsmooths 0 and
> preonly work.
>

 Why are you setting the coarse solver. This makes no sense.

Thanks,

 Matt


>
> Test 1
> $ make NCORE=1 runkr_nonsmooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>
> Test 2
> $ make NCORE=1 runkr_smooth
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
> makefile:43: recipe for target 'runkr_smooth' failed
> make: *** [runkr_smooth] Error 91
>
> Test 3
> $ make NCORE=1 runkr_gmres
> mpirun -n 1 ./step-41 \
>-st_ksp_type gmres  \
>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
> -st_mg_coarse_ksp_rtol 1.0e-6 \
>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
> makefile:59: recipe for target 'runkr_gmres' failed
> make: *** [runkr_gmres] Error 91
>
> Log files were attached.
> The matrix files were also attached as AMAT.dat and BMAT.dat.
>
> Is it correct? Or is something wrong with my code or command line?
>
> Thanks!
>
> Wenbo
>



 --
 What most experimenters take for granted before they begin their
 experiments is infinitely more interesting than any results to which their
 experiments lead.
 -- Norbert Wiener

 https://www.cse.buffalo.edu/~knepley/ 

>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/ 
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Mat/Vec with empty ranks

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 6:21 AM, Florian Lindner  wrote:

> Hello,
>
> I have a matrix and vector that live on 4 ranks, but only ranks 2 and 3
> have values:
>
> e.g.
>
> Vec Object: 4 MPI processes
>   type: mpi
> Process [0]
> Process [1]
> 1.1
> 2.5
> 3.
> 4.
> Process [2]
> 5.
> 6.
> 7.
> 8.
> Process [3]
>
>
> Doing a simple LSQR solve does not converge. However, when the values are
> distributed equally, it converges within 3
> iterations.
>
> What can I do about that?
>
> I have attached a simple program that creates the matrix and vector or
> loads them from a file.
>

There are a few problems with this program. I am attaching a cleaned-up
version. However, convergence still differs starting
at iteration 2. It appears that LSQR has a problem with this system, or we
have a bug that I cannot see.

  Thanks,

Matt


> Thanks,
> Florian
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 
#include <petscksp.h>


// Mat Q:
// 1.e+00 0.e+00 0.e+00 
// 1.e+00 0.e+00 1.e+00 
// 1.e+00 1.e+00 0.e+00 
// 1.e+00 1.e+00 1.e+00

// 1.e+00 2.e+00 0.e+00 
// 1.e+00 2.e+00 1.e+00 
// 1.e+00 3.e+00 0.e+00 
// 1.e+00 3.e+00 1.e+00 

PetscErrorCode fill(Mat m, Vec v) {
  PetscInt   idxn[3] = {0, 1, 2};
  PetscInt   localRows = 0;
  PetscMPIIntrank;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MPI_Comm_rank(MPI_COMM_WORLD, &rank);CHKERRQ(ierr);

  if (rank == 1 || rank == 2) localRows = 4;
  ierr = MatSetSizes(m, localRows, PETSC_DECIDE, PETSC_DECIDE, 3);CHKERRQ(ierr);
  ierr = VecSetSizes(v, localRows, PETSC_DECIDE);CHKERRQ(ierr);

  ierr = MatSetFromOptions(m);CHKERRQ(ierr);
  ierr = VecSetFromOptions(v);CHKERRQ(ierr);
  ierr = MatSetUp(m);CHKERRQ(ierr);

  if (rank == 1) {
PetscInt idxm[4] = {0, 1, 2, 3};
PetscScalar values[12] = {1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1};

ierr = MatSetValues(m, 4, idxm, 3, idxn, values, INSERT_VALUES);CHKERRQ(ierr);
ierr = VecSetValue(v, 0, 1.1, INSERT_VALUES); VecSetValue(v, 1, 2.5, INSERT_VALUES);CHKERRQ(ierr);
ierr = VecSetValue(v, 2, 3, INSERT_VALUES); VecSetValue(v, 3, 4, INSERT_VALUES);CHKERRQ(ierr);
  }
  if (rank == 2) {
PetscInt idxm[4] = {4, 5, 6, 7};
PetscScalar values[12] = {1, 2, 0, 1, 2, 1, 1, 3, 0, 1, 3, 1};

ierr = MatSetValues(m, 4, idxm, 3, idxn, values, INSERT_VALUES);CHKERRQ(ierr);
ierr = VecSetValue(v, 4, 5, INSERT_VALUES); VecSetValue(v, 5, 6, INSERT_VALUES);CHKERRQ(ierr);
ierr = VecSetValue(v, 6, 7, INSERT_VALUES); VecSetValue(v, 7, 8, INSERT_VALUES);CHKERRQ(ierr);
  }

  ierr = MatAssemblyBegin(m, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(m, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(v);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}


int main(int argc, char** argv)
{
  MatQ;
  Vecv, a;
  KSPQRsolver;
  PC pc;
  PetscViewerviewerQ, viewerV;
  PetscBool  load;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  ierr = VecCreate(PETSC_COMM_WORLD, &v);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &Q);CHKERRQ(ierr);
  ierr = MatSetType(Q, MATDENSE);CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrixQ", FILE_MODE_READ, &viewerQ);CHKERRQ(ierr);
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "in", FILE_MODE_READ, &viewerV);CHKERRQ(ierr);

  ierr = PetscOptionsHasName(NULL, NULL, "-load", &load);CHKERRQ(ierr);
  if (load) {
ierr = MatLoad(Q, viewerQ);CHKERRQ(ierr);
ierr = VecLoad(v, viewerV);CHKERRQ(ierr);
  } else {
ierr = fill(Q, v);CHKERRQ(ierr);
  }
  ierr = PetscViewerDestroy(&viewerQ);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewerV);CHKERRQ(ierr);

  ierr = MatCreateVecs(Q, &a, NULL);CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_WORLD, &QRsolver);CHKERRQ(ierr);
  ierr = KSPGetPC(QRsolver, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCNONE);CHKERRQ(ierr);
  ierr = KSPSetType(QRsolver, KSPLSQR);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(QRsolver);CHKERRQ(ierr);
  ierr = KSPSetOperators(QRsolver, Q, Q);CHKERRQ(ierr);
  ierr = MatViewFromOptions(Q, NULL, "-sys_view");CHKERRQ(ierr);
  ierr = VecViewFromOptions(a, NULL, "-rhs_view");CHKERRQ(ierr);
  ierr = KSPSolve(QRsolver, v, a);CHKERRQ(ierr);
  ierr = KSPDestroy(&QRsolver);CHKERRQ(ierr);
  ierr = VecDestroy(&a);CHKERRQ(ierr);
  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = MatDestroy(&Q);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Wenbo Zhao
Matt

Because I am not clear about what will happen when using 'preonly' for a
large-scale problem.

It seems to use a direct solver, according to the page below:
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPPREONLY.html

Thanks!
Wenbo

On Mon, Oct 2, 2017 at 5:09 PM, Matthew Knepley  wrote:

> On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao 
> wrote:
>
>> Matt,
>> Thanks for your reply.
>> It DOES make no sense for this problem.
>> But I am not clear about the 'preonly' option. Which solver is used in
>> preonly? I wonder if 'preonly' is suitable for large scale problem such as
>> 400,000,000 unknowns.
>> So I tried 'gmres' option and found these error messages.
>>
>
> I mean, why are you setting this at all. Just do not set the coarse
> solver. The default should work fine.
>
>   Thanks,
>
> Matt
>
>
>> Could you give me some suggestions?
>>
>> Thanks.
>>
>> Wenbo
>>
>>
>> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
>> wrote:
>>
>>> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao 
>>> wrote:
>>>
 Hi,

 I ran into some questions when using PETSc/SLEPc to solve the two-group
 neutron diffusion equations with the finite difference method. The grid is
 3*3*3, and the DOF at each point is 2, so the matrix size is 54*54.
 It is a generalized eigenvalue problem Ax = \lambda Bx, where B is a
 diagonally dominant matrix but not symmetric.
 EPS is set as below,
  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);¬
  ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);¬

 Krylov-Schur is used as the EPS solver. GAMG is used as the PC.
 I tried agg_nsmooths and mg_coarse_ksp_type. Only nsmooths 0 and
 preonly work.

>>>
>>> Why are you setting the coarse solver. This makes no sense.
>>>
>>>Thanks,
>>>
>>> Matt
>>>
>>>

 Test 1
 $ make NCORE=1 runkr_nonsmooth
 mpirun -n 1 ./step-41 \
-st_ksp_type gmres  \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1

 Test 2
 $ make NCORE=1 runkr_smooth
 mpirun -n 1 ./step-41 \
-st_ksp_type gmres  \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
 makefile:43: recipe for target 'runkr_smooth' failed
 make: *** [runkr_smooth] Error 91

 Test 3
 $ make NCORE=1 runkr_gmres
 mpirun -n 1 ./step-41 \
-st_ksp_type gmres  \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
 -st_mg_coarse_ksp_rtol 1.0e-6 \
-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
 makefile:59: recipe for target 'runkr_gmres' failed
 make: *** [runkr_gmres] Error 91

 Log files were attached.
 The matrix files were also attached as AMAT.dat and BMAT.dat.

 Is it correct? Or is something wrong with my code or command line?

 Thanks!

 Wenbo

>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/ 
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ 
>


[petsc-users] Mat/Vec with empty ranks

2017-10-02 Thread Florian Lindner
Hello,

I have a matrix and vector that live on 4 ranks, but only ranks 2 and 3 have 
values:

e.g.

Vec Object: 4 MPI processes
  type: mpi
Process [0]
Process [1]
1.1
2.5
3.
4.
Process [2]
5.
6.
7.
8.
Process [3]


Doing a simple LSQR solve does not converge. However, when the values are 
distributed equally, it converges within 3
iterations.

What can I do about that?

I have attached a simple program that creates the matrix and vector or loads 
them from a file.

Thanks,
Florian


in
Description: Binary data


in.info
Description: application/info


matrixQ
Description: Binary data


matrixQ.info
Description: application/info
#include <petscksp.h>


// Mat Q:
// 1.e+00 0.e+00 0.e+00 
// 1.e+00 0.e+00 1.e+00 
// 1.e+00 1.e+00 0.e+00 
// 1.e+00 1.e+00 1.e+00

// 1.e+00 2.e+00 0.e+00 
// 1.e+00 2.e+00 1.e+00 
// 1.e+00 3.e+00 0.e+00 
// 1.e+00 3.e+00 1.e+00 

void fill(Mat &m, Vec &v) {
  int rank;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  int localRows = 0;
  if (rank == 1 or rank == 2)
localRows = 4;

  MatSetSizes(m, localRows, PETSC_DECIDE, PETSC_DECIDE, 3);
  VecSetSizes(v, localRows, PETSC_DECIDE);
  
  MatSetFromOptions(m);
  VecSetFromOptions(v);
  MatSetUp(m);

  PetscInt idxn[] = {0, 1, 2};
  if (rank == 1) {
PetscInt idxm[] = {0, 1, 2, 3};
PetscScalar values[] = {1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1};
MatSetValues(m, 4, idxm, 3, idxn, values, INSERT_VALUES);
VecSetValue(v, 0, 1.1, INSERT_VALUES); VecSetValue(v, 1, 2.5, INSERT_VALUES);
VecSetValue(v, 2, 3, INSERT_VALUES); VecSetValue(v, 3, 4, INSERT_VALUES);
  }
  if (rank == 2) {
PetscInt idxm[] = {0, 1, 2, 3};
PetscScalar values[] = {1, 2, 0, 1, 2, 1, 1, 3, 0, 1, 3, 1};
MatSetValues(m, 4, idxm, 3, idxn, values, INSERT_VALUES);
VecSetValue(v, 4, 5, INSERT_VALUES); VecSetValue(v, 5, 6, INSERT_VALUES);
VecSetValue(v, 6, 7, INSERT_VALUES); VecSetValue(v, 7, 8, INSERT_VALUES);
  }

  MatAssemblyBegin(m, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(m, MAT_FINAL_ASSEMBLY);
  VecAssemblyBegin(v);
  VecAssemblyEnd(v);
}


int main(int argc, char** argv)
{
  PetscInitialize(&argc, &argv, "", nullptr);
  
  Mat Q;
  Vec v, a;
  PetscViewer viewerQ, viewerV;

  VecCreate(PETSC_COMM_WORLD, &v);
  MatCreate(PETSC_COMM_WORLD, &Q);
  MatSetType(Q, MATDENSE);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrixQ", FILE_MODE_READ, &viewerQ);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "in", FILE_MODE_READ, &viewerV);

  // Comment / uncomment here
  // MatLoad(Q, viewerQ);
  // VecLoad(v, viewerV);
  fill(Q, v);
  
  MatCreateVecs(Q, &a, nullptr);
  KSP QRsolver;
  PC pc;
  KSPCreate(PETSC_COMM_WORLD, &QRsolver);
  KSPGetPC(QRsolver, &pc);
  PCSetType(pc, PCNONE);
  KSPSetType(QRsolver, KSPLSQR);
  KSPSetOperators(QRsolver, Q, Q);
  KSPSolve(QRsolver, v, a);

  PetscFinalize();
  return 0;
}


Re: [petsc-users] Load distributed matrices from directory

2017-10-02 Thread Matthew Knepley
On Mon, Oct 2, 2017 at 4:12 AM, Matthieu Vitse 
wrote:

>
> Le 29 sept. 2017 à 17:43, Barry Smith  a écrit :
>
>  Or is your matrix generator code sequential and cannot generate the full
> matrix so you want to generate chunks at a time and save to disk then load
> them? Better for you to refactor your code to work in parallel in
> generating the whole thing (since you can already generate parts the
> refactoring shouldn't be terribly difficult).
>
>
> Thanks for your answer.
>
> The matrix is already generated in parallel, but we want to keep control
> on the decomposition which conflicts with directly using PCASM.
>

Please explain this statement with an example. When using MatLoad(), you
are in control of the partitions, although not of the row order.
Also, I am confused by your use of the word "distributed". We use it to
mean an object, like a Mat, that exists on several processes in a
coordinated way.

  Thanks,

Matt


> That’s why we would really like to work only with the distributed
> matrices. Are there some issues that would prevent me from doing that ?
> Moreover, ASM is a first step, we would like then to use those matrices for
> multi-preconditioning our problem, and take into account MPCs (as a
> consequence we really need to know the decomposition).
>
> Thanks,
>
> —
> Matt
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Issue of mg_coarse_ksp not converge

2017-10-02 Thread Matthew Knepley
On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao  wrote:

> Matt,
> Thanks for your reply.
> It DOES make no sense for this problem.
> But I am not clear about the 'preonly' option. Which solver is used in
> preonly? I wonder if 'preonly' is suitable for large scale problem such as
> 400,000,000 unknowns.
> So I tried 'gmres' option and found these error messages.
>

I mean, why are you setting this at all? Just do not set the coarse solver.
The default should work fine.

  Thanks,

Matt


> Could you give me some suggestions?
>
> Thanks.
>
> Wenbo
>
>
> On Mon, Oct 2, 2017 at 12:34 AM, Matthew Knepley 
> wrote:
>
>> On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao 
>> wrote:
>>
>>> Hi,
>>>
>>> I ran into some questions when using PETSc/SLEPc to solve the two-group
>>> neutron diffusion equations with the finite difference method. The grid is
>>> 3*3*3, and the DOF at each point is 2, so the matrix size is 54*54.
>>> It is a generalized eigenvalue problem Ax = \lambda Bx, where B is a
>>> diagonally dominant matrix but not symmetric.
>>> EPS is set as below,
>>>  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr);¬
>>>  ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);¬
>>>
>>> Krylov-Schur is used as the EPS solver. GAMG is used as the PC.
>>> I tried agg_nsmooths and mg_coarse_ksp_type. Only nsmooths 0 and
>>> preonly work.
>>>
>>
>> Why are you setting the coarse solver. This makes no sense.
>>
>>Thanks,
>>
>> Matt
>>
>>
>>>
>>> Test 1
>>> $ make NCORE=1 runkr_nonsmooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_nonsmooth 2>&1
>>>
>>> Test 2
>>> $ make NCORE=1 runkr_smooth
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type preonly   -st_mg_coarse_ksp_monitor  \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor -log_view > log_smooth 2>&1
>>> makefile:43: recipe for target 'runkr_smooth' failed
>>> make: *** [runkr_smooth] Error 91
>>>
>>> Test 3
>>> $ make NCORE=1 runkr_gmres
>>> mpirun -n 1 ./step-41 \
>>>-st_ksp_type gmres  \
>>>-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
>>>-st_ksp_view  -mata AMAT.dat -matb BMAT.dat \
>>>-st_mg_coarse_ksp_type gmres  -st_mg_coarse_ksp_monitor
>>> -st_mg_coarse_ksp_rtol 1.0e-6 \
>>>-eps_nev 1 -eps_ncv 10  -eps_monitor  -log_view > log_gmres 2>&1
>>> makefile:59: recipe for target 'runkr_gmres' failed
>>> make: *** [runkr_gmres] Error 91
>>>
>>> Log files were attached.
>>> The matrix files were also attached as AMAT.dat and BMAT.dat.
>>>
>>> Is it correct? Or is something wrong with my code or command line?
>>>
>>> Thanks!
>>>
>>> Wenbo
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/ 
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Load distributed matrices from directory

2017-10-02 Thread Matthieu Vitse

> Le 29 sept. 2017 à 17:43, Barry Smith  > a écrit :
> 
>  Or is your matrix generator code sequential and cannot generate the full 
> matrix so you want to generate chunks at a time and save to disk then load 
> them? Better for you to refactor your code to work in parallel in generating 
> the whole thing (since you can already generate parts the refactoring 
> shouldn't be terribly difficult).

Thanks for your answer. 

The matrix is already generated in parallel, but we want to keep control on the 
decomposition which conflicts with directly using PCASM. That’s why we would 
really like to work only with the distributed matrices. Are there some issues 
that would prevent me from doing that ? Moreover, ASM is a first step, we would 
like then to use those matrices for multi-preconditioning our problem, and take 
into account MPCs (as a consequence we really need to know the decomposition). 

Thanks, 

— 
Matt