On Tue, Oct 3, 2017 at 1:49 AM, Mark Adams wrote:
Well that is strange, the PETSc tests work.
Wenbo, could you please:
> git clone -b gamg-fix-eig-err https://bitbucket.org/petsc/petsc petsc2
> cd petsc2
and reconfigure, make, and then run your test without the
-st_gamg_est_ksp_error_if_not_converged 0 fix, and see if this fixes the
problem.
On Mon, Oct 2, 2017 at 11:56 AM, Wenbo Zhao wrote:
On Mon, Oct 2, 2017 at 11:49 PM, Mark Adams wrote:
> Wenbo, do you build your PETSc?
Yes.
My configure option is listed below:
./configure --with-mpi=1 --with-shared-libraries=1 \
--with-64-bit-indices=1 --with-debugging=1
And I set PETSC_DIR, PETSC_ARCH and SLEPC_DIR in my ~/.ba
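For reference, that environment setup typically looks like the following in a shell startup file. The paths and the arch name here are hypothetical placeholders (arch-linux-c-debug is only PETSc's usual default for a debug build), not Wenbo's actual settings:

```shell
# Hypothetical locations -- adjust to where PETSc/SLEPc were actually built.
export PETSC_DIR=$HOME/petsc
export PETSC_ARCH=arch-linux-c-debug
export SLEPC_DIR=$HOME/slepc
```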
On Mon, Oct 2, 2017 at 11:45 AM, Mark Adams wrote:
This is normal:
Linear st_gamg_est_ solve did not converge due to DIVERGED_ITS iterations 10
It looks like ksp->errorifnotconverged got set somehow. If the default
changed in KSP then (SAGG) GAMG would not ever work.
I assume you don't have a .petscrc file with more (crazy) options in it ...
On Mon, Oct 2, 2017 at 11:15 AM, Mark Adams wrote:
> non-smoothed aggregation is converging very fast. smoothed fails in the
> eigen estimator.
>
> Run this again with -st_gamg_est_ksp_view and -st_gamg_est_ksp_monitor,
> and see if you get more output (I'm not 100% sure about these args).
>
It seems to solve fine but then fails on this:
ierr = PetscObjectSAWsBlock((PetscObject)ksp);CHKERRQ(ierr);
if (ksp->errorifnotconverged && ksp->reason < 0)
SETERRQ(comm,PETSC_ERR_NOT_CONVERGED,"KSPSolve has not converged");
It looks like somehow ksp->errorifnotconverged got set.
I get more output
zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_smooth
mpirun -n 1 ./step-41 \
-st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1 \
-mata AMAT.dat -matb BMAT.dat \
-st_gamg_est_ksp_view -st_
Yea, it fails in the eigen estimator, but the Cheby eigen estimator works
in the solve that works:
eigenvalue estimates used: min = 0.14, max = 1.10004
eigenvalues estimate via gmres min 0.0118548, max 1.4
Why would it just give "KSPSolve has not converged"?
On Mon, Oct 2, 2017 at 11:06 AM, Wenbo Zhao wrote:
> Matt,
Thanks Wenbo.
Matt,
Test 1 nonsmooth
zhaowenbo@ubuntu:~/test_slepc/SPARK/spark$ make NCORE=1 runkr_nonsmooth
mpirun -n 1 ./step-41 \
-st_ksp_type gmres -st_ksp_view -st_ksp_monitor \
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 0 \
-mata AMAT.dat -matb BMAT.dat \
-eps_nev 1 -eps_
On Mon, Oct 2, 2017 at 10:43 AM, Wenbo Zhao wrote:
> Mark,
>
> Thanks for your reply.
>
> Test 1 with nonsmooth and preonly is OK.
Please send the output with -st_ksp_view and -st_ksp_monitor and we can
start to debug it.
You mentioned that B is not symmetric. I assume it is elliptic (diffusion).
Where does the asymmetry come from?
On Mon, Oct 2, 2017 at 9:39 AM, Wenbo Zhao wrote:
Matt,
Thanks for your reply.
Because the default options did not work at first (-st_ksp_type gmres
-st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1), I tried
to test those options.
Wenbo
GAMG will coarsen the problem until it is small and fast to solve with a
direct solver (LU). You can use preonly if you have a perfect
preconditioner.
On Mon, Oct 2, 2017 at 8:30 AM, Wenbo Zhao wrote:
> Matt,
>
> Because I am not clear about what will happen when using 'preonly' for a
> large-scale problem.
>
The size of the problem has nothing to do with 'preonly'. All it means is
to apply a preconditioner without a Krylov solver.
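The contrast, written out as PETSc options: the first line is the configuration actually used in this thread, while the preonly/lu pair is only an illustrative configuration (nobody here ran it), showing a preconditioner applied once with no outer Krylov iteration:

```
# Krylov solve with GAMG preconditioning (as used in this thread)
-st_ksp_type gmres -st_pc_type gamg -st_pc_gamg_type agg -st_pc_gamg_agg_nsmooths 1

# Apply the preconditioner alone, no Krylov method
-st_ksp_type preonly -st_pc_type lu
```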
Matt
Because I am not clear about what will happen when using 'preonly' for a
large-scale problem.
It seems to use a direct solver from below,
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPPREONLY.html
Thanks!
Wenbo
On Sun, Oct 1, 2017 at 9:53 PM, Wenbo Zhao wrote:
Matt,
Thanks for your reply.
It DOES make no sense for this problem.
But I am not clear about the 'preonly' option. Which solver is used in
preonly? I wonder if 'preonly' is suitable for a large-scale problem such
as one with 400,000,000 unknowns.
So I tried the 'gmres' option and found these error messages.
On Sun, Oct 1, 2017 at 6:49 AM, Wenbo Zhao wrote:
> Hi,
>
> I met some questions when I use PETSc/SLEPc to solve the two-group neutron
> diffusion equations with the finite difference method. The grid is 3*3*3,
> and the DOF on each point is 2, so the matrix size is 54*54.
> It is a generalized eigenvalue problem.