Re: [petsc-users] GAMG failure

2023-03-28 Thread Mark Adams
On Tue, Mar 28, 2023 at 12:38 PM Blaise Bourdin  wrote:

>
>
> On Mar 27, 2023, at 9:11 PM, Mark Adams  wrote:
>
> Yes, the eigen estimates are converging slowly.
>
> BTW, have you tried hypre? It is a good solver (lots more woman-years of development).
> These eigen estimates are conceptually simple, but they can lead to
> problems like this (hypre uses an eigen-estimate-free smoother).
>
> I just moved from petsc 3.3 to main, so my experience with an old version
> of hypre has not been very convincing. Strangely enough, ML has always been
> the most efficient PC for me.
>

ML is a good solver.


> Maybe it’s time to revisit.
> That said, I would really like to get decent performance out of gamg. One
> day, I’d like to be able to account for the special structure of
> phase-field fracture in the construction of the coarse space.
>
>
> But try this (good to have options anyway):
>
> -pc_gamg_esteig_ksp_max_it 20
>
> Cheby will scale the estimate that we give by, I think, 5% by default.
> Maybe 10%.
> You can set that with:
>
> -mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05
>
> 0.2 is the scaling of the high eigen estimate for the low eigen value in
> Chebyshev.
>
>
>
> Jed’s suggestion of using -pc_gamg_reuse_interpolation 0 worked.
>

OK, I have to admit I am surprised.
But I guess with your fracture the matrix/physics/dynamics does change a lot.


> I am testing your options at the moment.
>

There are a lot of options and it is cumbersome, but they are finite and
good to know.
Glad it's working,


>
> Thanks a lot,
>
> Blaise
>
> —
> Canada Research Chair in Mathematical and Computational Aspects of Solid
> Mechanics (Tier 1)
> Professor, Department of Mathematics & Statistics
> Hamilton Hall room 409A, McMaster University
> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243
>
>


Re: [petsc-users] GAMG failure

2023-03-28 Thread Jed Brown
This suite has been good for my solid mechanics solvers. (It's written here as 
a coarse grid solver because we do matrix-free p-MG first, but you can use it 
directly.)

https://github.com/hypre-space/hypre/issues/601#issuecomment-1069426997
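The linked comment has the specific BoomerAMG settings; as a generic starting point (a sketch, not the values from the link), hypre is selected with

  -pc_type hypre -pc_hypre_type boomeramg

and solver-specific tuning is then done through the -pc_hypre_boomeramg_* options.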

Blaise Bourdin  writes:

>  On Mar 27, 2023, at 9:11 PM, Mark Adams  wrote:
>
>  Yes, the eigen estimates are converging slowly. 
>
>  BTW, have you tried hypre? It is a good solver (lots more woman-years of development).
>  These eigen estimates are conceptually simple, but they can lead to problems 
> like this (hypre uses an eigen-estimate-free
>  smoother).
>
> I just moved from petsc 3.3 to main, so my experience with an old version of 
> hypre has not been very convincing. Strangely
> enough, ML has always been the most efficient PC for me. Maybe it’s time to 
> revisit.
> That said, I would really like to get decent performance out of gamg. One 
> day, I’d like to be able to account for the special structure
> of phase-field fracture in the construction of the coarse space.
>
>  But try this (good to have options anyway):
>
>  -pc_gamg_esteig_ksp_max_it 20
>
>  Cheby will scale the estimate that we give by, I think, 5% by default. Maybe 
> 10%.
>  You can set that with:
>
>  -mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05
>
>  0.2 is the scaling of the high eigen estimate for the low eigen value in 
> Chebyshev.
>
> Jed’s suggestion of using -pc_gamg_reuse_interpolation 0 worked. I am testing 
> your options at the moment.
>
> Thanks a lot,
>
> Blaise
>
> — 
> Canada Research Chair in Mathematical and Computational Aspects of Solid 
> Mechanics (Tier 1)
> Professor, Department of Mathematics & Statistics
> Hamilton Hall room 409A, McMaster University
> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243


Re: [petsc-users] GAMG failure

2023-03-28 Thread Blaise Bourdin

On Mar 27, 2023, at 9:11 PM, Mark Adams  wrote:

Yes, the eigen estimates are converging slowly.

BTW, have you tried hypre? It is a good solver (lots more woman-years of development).
These eigen estimates are conceptually simple, but they can lead to problems like this (hypre uses an eigen-estimate-free smoother).

I just moved from petsc 3.3 to main, so my experience with an old version of hypre has not been very convincing. Strangely enough, ML has always been the most efficient PC for me. Maybe it’s time to revisit.
That said, I would really like to get decent performance out of gamg. One day, I’d like to be able to account for the special structure of phase-field fracture in the construction of the coarse space.

But try this (good to have options anyway):

-pc_gamg_esteig_ksp_max_it 20

Cheby will scale the estimate that we give by, I think, 5% by default. Maybe 10%.
You can set that with:

-mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05

0.2 is the scaling of the high eigen estimate for the low eigen value in Chebyshev.

Jed’s suggestion of using -pc_gamg_reuse_interpolation 0 worked. I am testing your options at the moment.

Thanks a lot,

Blaise

—
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243

Re: [petsc-users] GAMG failure

2023-03-27 Thread Mark Adams
Yes, the eigen estimates are converging slowly.

BTW, have you tried hypre? It is a good solver (lots more woman-years of development).
These eigen estimates are conceptually simple, but they can lead to
problems like this (hypre uses an eigen-estimate-free smoother).

But try this (good to have options anyway):

-pc_gamg_esteig_ksp_max_it 20

Cheby will scale the estimate that we give by, I think, 5% by default.
Maybe 10%.
You can set that with:

-mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05

0.2 is the scaling of the high eigen estimate for the low eigen value in
Chebyshev.
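For reference, the suggestions in this message combined as run-time options (a sketch; the four esteig numbers are the transform applied to the estimated extreme eigenvalues, so 0,0.2,0,1.05 makes Chebyshev target the interval [0.2*emax_est, 1.05*emax_est]):

  -pc_gamg_esteig_ksp_max_it 20
  -mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05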


On Mon, Mar 27, 2023 at 5:06 PM Blaise Bourdin  wrote:

>
>
> On Mar 24, 2023, at 3:21 PM, Mark Adams  wrote:
>
> * Do you set:
>
> PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
>
> PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));
>
>
> Yes
>
>
> Do that to get CG Eigen estimates. Outright failure is usually caused by a
> bad Eigen estimate.
> -pc_gamg_esteig_ksp_monitor_singular_value
> Will print out the estimates as it's iterating. You can look at that to
> check that the max has converged.
>
>
> I just did, and something is off:
> I do multiple calls to SNESSolve (staggered scheme for phase-field
> fracture), but only get information on the first solve (which is not the
> one failing, of course)
> Here is what I get:
> Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 7.636421712860e+01 % max 1.e+00 min
> 1.e+00 max/min 1.e+00
>   1 KSP Residual norm 3.402024867977e+01 % max 1.114319928921e+00 min
> 1.114319928921e+00 max/min 1.e+00
>   2 KSP Residual norm 2.124815079671e+01 % max 1.501143586520e+00 min
> 5.739351119078e-01 max/min 2.615528402732e+00
>   3 KSP Residual norm 1.581785698912e+01 % max 1.644351137983e+00 min
> 3.263683482596e-01 max/min 5.038329074347e+00
>   4 KSP Residual norm 1.254871990315e+01 % max 1.714668863819e+00 min
> 2.044075812142e-01 max/min 8.388479789416e+00
>   5 KSP Residual norm 1.051198229090e+01 % max 1.760078533063e+00 min
> 1.409327403114e-01 max/min 1.248878386367e+01
>   6 KSP Residual norm 9.061658306086e+00 % max 1.792995287686e+00 min
> 1.023484740555e-01 max/min 1.751853463603e+01
>   7 KSP Residual norm 8.015529297567e+00 % max 1.821497535985e+00 min
> 7.818018001928e-02 max/min 2.329871248104e+01
>   8 KSP Residual norm 7.201063258957e+00 % max 1.855140071935e+00 min
> 6.178572472468e-02 max/min 3.002538337458e+01
>   9 KSP Residual norm 6.548491711695e+00 % max 1.903578294573e+00 min
> 5.008612895206e-02 max/min 3.800609738466e+01
>  10 KSP Residual norm 6.002109992255e+00 % max 1.961356890125e+00 min
> 4.130572033722e-02 max/min 4.748390475004e+01
>   Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 2.373573910237e+02 % max 1.e+00 min
> 1.e+00 max/min 1.e+00
>   1 KSP Residual norm 8.845061415709e+01 % max 1.081192207576e+00 min
> 1.081192207576e+00 max/min 1.e+00
>   2 KSP Residual norm 5.607525485152e+01 % max 1.345947059840e+00 min
> 5.768825326129e-01 max/min 2.333138869267e+00
>   3 KSP Residual norm 4.123522550864e+01 % max 1.481153523075e+00 min
> 3.070603564913e-01 max/min 4.823655974348e+00
>   4 KSP Residual norm 3.345765664017e+01 % max 1.551374710727e+00 min
> 1.953487694959e-01 max/min 7.941563771968e+00
>   5 KSP Residual norm 2.859712984893e+01 % max 1.604588395452e+00 min
> 1.313871480574e-01 max/min 1.221267391199e+01
>   6 KSP Residual norm 2.525636054248e+01 % max 1.650487481750e+00 min
> 9.322735730688e-02 max/min 1.770389646804e+01
>   7 KSP Residual norm 2.270711391451e+01 % max 1.697243639599e+00 min
> 6.945419058256e-02 max/min 2.443687883140e+01
>   8 KSP Residual norm 2.074739485241e+01 % max 1.737293728907e+00 min
> 5.319942519758e-02 max/min 3.265624999621e+01
>   9 KSP Residual norm 1.912808268870e+01 % max 1.771708608618e+00 min
> 4.229776586667e-02 max/min 4.188657656771e+01
>  10 KSP Residual norm 1.787394414641e+01 % max 1.802834420843e+00 min
> 3.460455235448e-02 max/min 5.209818645753e+01
>   Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 1.361990679391e+03 % max 1.e+00 min
> 1.e+00 max/min 1.e+00
>   1 KSP Residual norm 5.377188333825e+02 % max 1.086812916769e+00 min
> 1.086812916769e+00 max/min 1.e+00
>   2 KSP Residual norm 2.819790765047e+02 % max 1.474233179517e+00 min
> 6.475176340551e-01 max/min 2.276745994212e+00
>   3 KSP Residual norm 1.856720658591e+02 % max 1.646049713883e+00 min
> 4.391851040105e-01 max/min 3.747963441500e+00
>   4 KSP Residual norm 1.446507859917e+02 % max 1.760403013135e+00 min
> 2.972886103795e-01 max/min 5.921528614526e+00
>   5 KSP Residual norm 1.212491636433e+02 % max 1.839250080524e+00 min
> 1.921591413785e-01 max/min 9.571494061277e+00
>   6 KSP Residual norm 1.052783637696e+02 % max 1.887062042760e+00 min
> 1.275920366984e-01 max/min 1.478981048966e+01
>   7 KSP 

Re: [petsc-users] GAMG failure

2023-03-27 Thread Jed Brown
Try -pc_gamg_reuse_interpolation 0. I thought this was disabled by default, but 
I see pc_gamg->reuse_prol = PETSC_TRUE in the code.
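A minimal sketch of setting this programmatically rather than on the command line (assumes a KSP named ksp whose preconditioner is GAMG; written with the current PetscCall macro):

  PC pc;
  PetscCall(KSPGetPC(ksp, &pc));                            /* the PC is assumed to be PCGAMG */
  PetscCall(PCGAMGSetReuseInterpolation(pc, PETSC_FALSE));  /* rebuild the prolongator at each setup */

The command-line form is -pc_gamg_reuse_interpolation 0, as above.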

Blaise Bourdin  writes:

>  On Mar 24, 2023, at 3:21 PM, Mark Adams  wrote:
>
>  * Do you set: 
>
>  PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
>
>  PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));
>
> Yes
>
>  Do that to get CG Eigen estimates. Outright failure is usually caused by a 
> bad Eigen estimate.
>  -pc_gamg_esteig_ksp_monitor_singular_value
>  Will print out the estimates as it's iterating. You can look at that to check 
> that the max has converged.
>
> I just did, and something is off:
> I do multiple calls to SNESSolve (staggered scheme for phase-field fracture), 
> but only get information on the first solve (which is 
> not the one failing, of course)
> Here is what I get:
> Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 7.636421712860e+01 % max 1.e+00 min 
> 1.e+00 max/min
> 1.e+00
>   1 KSP Residual norm 3.402024867977e+01 % max 1.114319928921e+00 min 
> 1.114319928921e+00 max/min
> 1.e+00
>   2 KSP Residual norm 2.124815079671e+01 % max 1.501143586520e+00 min 
> 5.739351119078e-01 max/min
> 2.615528402732e+00
>   3 KSP Residual norm 1.581785698912e+01 % max 1.644351137983e+00 min 
> 3.263683482596e-01 max/min
> 5.038329074347e+00
>   4 KSP Residual norm 1.254871990315e+01 % max 1.714668863819e+00 min 
> 2.044075812142e-01 max/min
> 8.388479789416e+00
>   5 KSP Residual norm 1.051198229090e+01 % max 1.760078533063e+00 min 
> 1.409327403114e-01 max/min
> 1.248878386367e+01
>   6 KSP Residual norm 9.061658306086e+00 % max 1.792995287686e+00 min 
> 1.023484740555e-01 max/min
> 1.751853463603e+01
>   7 KSP Residual norm 8.015529297567e+00 % max 1.821497535985e+00 min 
> 7.818018001928e-02 max/min
> 2.329871248104e+01
>   8 KSP Residual norm 7.201063258957e+00 % max 1.855140071935e+00 min 
> 6.178572472468e-02 max/min
> 3.002538337458e+01
>   9 KSP Residual norm 6.548491711695e+00 % max 1.903578294573e+00 min 
> 5.008612895206e-02 max/min
> 3.800609738466e+01
>  10 KSP Residual norm 6.002109992255e+00 % max 1.961356890125e+00 min 
> 4.130572033722e-02 max/min
> 4.748390475004e+01
>   Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 2.373573910237e+02 % max 1.e+00 min 
> 1.e+00 max/min
> 1.e+00
>   1 KSP Residual norm 8.845061415709e+01 % max 1.081192207576e+00 min 
> 1.081192207576e+00 max/min
> 1.e+00
>   2 KSP Residual norm 5.607525485152e+01 % max 1.345947059840e+00 min 
> 5.768825326129e-01 max/min
> 2.333138869267e+00
>   3 KSP Residual norm 4.123522550864e+01 % max 1.481153523075e+00 min 
> 3.070603564913e-01 max/min
> 4.823655974348e+00
>   4 KSP Residual norm 3.345765664017e+01 % max 1.551374710727e+00 min 
> 1.953487694959e-01 max/min
> 7.941563771968e+00
>   5 KSP Residual norm 2.859712984893e+01 % max 1.604588395452e+00 min 
> 1.313871480574e-01 max/min
> 1.221267391199e+01
>   6 KSP Residual norm 2.525636054248e+01 % max 1.650487481750e+00 min 
> 9.322735730688e-02 max/min
> 1.770389646804e+01
>   7 KSP Residual norm 2.270711391451e+01 % max 1.697243639599e+00 min 
> 6.945419058256e-02 max/min
> 2.443687883140e+01
>   8 KSP Residual norm 2.074739485241e+01 % max 1.737293728907e+00 min 
> 5.319942519758e-02 max/min
> 3.265624999621e+01
>   9 KSP Residual norm 1.912808268870e+01 % max 1.771708608618e+00 min 
> 4.229776586667e-02 max/min
> 4.188657656771e+01
>  10 KSP Residual norm 1.787394414641e+01 % max 1.802834420843e+00 min 
> 3.460455235448e-02 max/min
> 5.209818645753e+01
>   Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 1.361990679391e+03 % max 1.e+00 min 
> 1.e+00 max/min
> 1.e+00
>   1 KSP Residual norm 5.377188333825e+02 % max 1.086812916769e+00 min 
> 1.086812916769e+00 max/min
> 1.e+00
>   2 KSP Residual norm 2.819790765047e+02 % max 1.474233179517e+00 min 
> 6.475176340551e-01 max/min
> 2.276745994212e+00
>   3 KSP Residual norm 1.856720658591e+02 % max 1.646049713883e+00 min 
> 4.391851040105e-01 max/min
> 3.747963441500e+00
>   4 KSP Residual norm 1.446507859917e+02 % max 1.760403013135e+00 min 
> 2.972886103795e-01 max/min
> 5.921528614526e+00
>   5 KSP Residual norm 1.212491636433e+02 % max 1.839250080524e+00 min 
> 1.921591413785e-01 max/min
> 9.571494061277e+00
>   6 KSP Residual norm 1.052783637696e+02 % max 1.887062042760e+00 min 
> 1.275920366984e-01 max/min
> 1.478981048966e+01
>   7 KSP Residual norm 9.230292625762e+01 % max 1.917891358356e+00 min 
> 8.853577120467e-02 max/min
> 2.166233300122e+01
>   8 KSP Residual norm 8.262607594297e+01 % max 1.935857204308e+00 min 
> 6.706949937710e-02 max/min
> 2.886345093206e+01
>   9 KSP Residual norm 7.616474911000e+01 % max 1.946323901431e+00 min 
> 5.354310733090e-02 max/min
> 3.635059671458e+01
>  10 KSP Residual norm 

Re: [petsc-users] GAMG failure

2023-03-27 Thread Blaise Bourdin

On Mar 24, 2023, at 3:21 PM, Mark Adams  wrote:

* Do you set:

    PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
    PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));

Yes

Do that to get CG Eigen estimates. Outright failure is usually caused by a bad Eigen estimate.
-pc_gamg_esteig_ksp_monitor_singular_value
Will print out the estimates as it's iterating. You can look at that to check that the max has converged.

I just did, and something is off:
I do multiple calls to SNESSolve (staggered scheme for phase-field fracture), but only get information on the first solve (which is not the one failing, of course).
Here is what I get:

Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 7.636421712860e+01 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 3.402024867977e+01 % max 1.114319928921e+00 min 1.114319928921e+00 max/min 1.e+00
  2 KSP Residual norm 2.124815079671e+01 % max 1.501143586520e+00 min 5.739351119078e-01 max/min 2.615528402732e+00
  3 KSP Residual norm 1.581785698912e+01 % max 1.644351137983e+00 min 3.263683482596e-01 max/min 5.038329074347e+00
  4 KSP Residual norm 1.254871990315e+01 % max 1.714668863819e+00 min 2.044075812142e-01 max/min 8.388479789416e+00
  5 KSP Residual norm 1.051198229090e+01 % max 1.760078533063e+00 min 1.409327403114e-01 max/min 1.248878386367e+01
  6 KSP Residual norm 9.061658306086e+00 % max 1.792995287686e+00 min 1.023484740555e-01 max/min 1.751853463603e+01
  7 KSP Residual norm 8.015529297567e+00 % max 1.821497535985e+00 min 7.818018001928e-02 max/min 2.329871248104e+01
  8 KSP Residual norm 7.201063258957e+00 % max 1.855140071935e+00 min 6.178572472468e-02 max/min 3.002538337458e+01
  9 KSP Residual norm 6.548491711695e+00 % max 1.903578294573e+00 min 5.008612895206e-02 max/min 3.800609738466e+01
 10 KSP Residual norm 6.002109992255e+00 % max 1.961356890125e+00 min 4.130572033722e-02 max/min 4.748390475004e+01
  Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 2.373573910237e+02 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 8.845061415709e+01 % max 1.081192207576e+00 min 1.081192207576e+00 max/min 1.e+00
  2 KSP Residual norm 5.607525485152e+01 % max 1.345947059840e+00 min 5.768825326129e-01 max/min 2.333138869267e+00
  3 KSP Residual norm 4.123522550864e+01 % max 1.481153523075e+00 min 3.070603564913e-01 max/min 4.823655974348e+00
  4 KSP Residual norm 3.345765664017e+01 % max 1.551374710727e+00 min 1.953487694959e-01 max/min 7.941563771968e+00
  5 KSP Residual norm 2.859712984893e+01 % max 1.604588395452e+00 min 1.313871480574e-01 max/min 1.221267391199e+01
  6 KSP Residual norm 2.525636054248e+01 % max 1.650487481750e+00 min 9.322735730688e-02 max/min 1.770389646804e+01
  7 KSP Residual norm 2.270711391451e+01 % max 1.697243639599e+00 min 6.945419058256e-02 max/min 2.443687883140e+01
  8 KSP Residual norm 2.074739485241e+01 % max 1.737293728907e+00 min 5.319942519758e-02 max/min 3.265624999621e+01
  9 KSP Residual norm 1.912808268870e+01 % max 1.771708608618e+00 min 4.229776586667e-02 max/min 4.188657656771e+01
 10 KSP Residual norm 1.787394414641e+01 % max 1.802834420843e+00 min 3.460455235448e-02 max/min 5.209818645753e+01
  Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 1.361990679391e+03 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 5.377188333825e+02 % max 1.086812916769e+00 min 1.086812916769e+00 max/min 1.e+00
  2 KSP Residual norm 2.819790765047e+02 % max 1.474233179517e+00 min 6.475176340551e-01 max/min 2.276745994212e+00
  3 KSP Residual norm 1.856720658591e+02 % max 1.646049713883e+00 min 4.391851040105e-01 max/min 3.747963441500e+00
  4 KSP Residual norm 1.446507859917e+02 % max 1.760403013135e+00 min 2.972886103795e-01 max/min 5.921528614526e+00
  5 KSP Residual norm 1.212491636433e+02 % max 1.839250080524e+00 min 1.921591413785e-01 max/min 9.571494061277e+00
  6 KSP Residual norm 1.052783637696e+02 % max 1.887062042760e+00 min 1.275920366984e-01 max/min 1.478981048966e+01
  7 KSP Residual norm 9.230292625762e+01 % max 1.917891358356e+00 min 8.853577120467e-02 max/min 2.166233300122e+01
  8 KSP Residual norm 8.262607594297e+01 % max 1.935857204308e+00 min 6.706949937710e-02 max/min 2.886345093206e+01
  9 KSP Residual norm 7.616474911000e+01 % max 1.946323901431e+00 min 5.354310733090e-02 max/min 3.635059671458e+01
 10 KSP Residual norm 7.138356892221e+01 % max 1.954382723686e+00 min 4.367661484659e-02 max/min 4.474666204216e+01
  Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 3.702300162209e+03 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 1.255008322497e+03 % max 

Re: [petsc-users] GAMG failure

2023-03-24 Thread Mark Adams
* Do you set:

PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));

Do that to get CG Eigen estimates. Outright failure is usually caused by a
bad Eigen estimate.
-pc_gamg_esteig_ksp_monitor_singular_value
Will print out the estimates as it's iterating. You can look at that to
check that the max has converged.

*  -pc_gamg_aggressive_coarsening 0

will slow coarsening, as does increasing the threshold.

* you can run with '-info :pc' and send me the output (grep on GAMG)

Mark
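For reference, the diagnostics suggested above can be combined in a single run (option names exactly as given in this thread):

  -pc_gamg_esteig_ksp_monitor_singular_value
  -pc_gamg_aggressive_coarsening 0
  -info :pc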

On Fri, Mar 24, 2023 at 2:47 PM Jed Brown  wrote:

> You can -pc_gamg_threshold .02 to slow the coarsening and either stronger
> smoother or increase number of iterations used for estimation (or increase
> tolerance). I assume your system is SPD and you've set the near-null space.
>
> Blaise Bourdin  writes:
>
> > Hi,
> >
> > I am having issues with GAMG for some very ill-conditioned 2D linearized
> elasticity problems (sharp variation of elastic moduli with thin regions
> of nearly incompressible material). I use snes_type newtonls,
> linesearch_type cp, and pc_type gamg without any further options. pc_type
> jacobi converges fine (although slowly, of course).
> >
> >
> > I am not really surprised that gamg would not converge out of the box,
> but don’t know where to start to investigate the convergence failure. Can
> anybody help?
> >
> > Blaise
> >
> > —
> > Canada Research Chair in Mathematical and Computational Aspects of Solid
> Mechanics (Tier 1)
> > Professor, Department of Mathematics & Statistics
> > Hamilton Hall room 409A, McMaster University
> > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
> > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243
>


Re: [petsc-users] GAMG failure

2023-03-24 Thread Jed Brown
You can use -pc_gamg_threshold .02 to slow the coarsening, and either use a 
stronger smoother or increase the number of iterations used for estimation (or 
tighten the tolerance). I assume your system is SPD and you've set the near-null space.
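As a concrete sketch, the two knobs mentioned here (values are the illustrative ones used in this thread):

  -pc_gamg_threshold 0.02          (coarsen more slowly)
  -pc_gamg_esteig_ksp_max_it 20    (more iterations for the eigenvalue estimate)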

Blaise Bourdin  writes:

> Hi,
>
> I am having issues with GAMG for some very ill-conditioned 2D linearized 
> elasticity problems (sharp variation of elastic moduli with thin regions of 
> nearly incompressible material). I use snes_type newtonls, linesearch_type 
> cp, and pc_type gamg without any further options. pc_type jacobi converges 
> fine (although slowly, of course).
>
>
> I am not really surprised that gamg would not converge out of the box, but 
> don’t know where to start to investigate the convergence failure. Can anybody 
> help?
>
> Blaise
>
> — 
> Canada Research Chair in Mathematical and Computational Aspects of Solid 
> Mechanics (Tier 1)
> Professor, Department of Mathematics & Statistics
> Hamilton Hall room 409A, McMaster University
> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243


[petsc-users] GAMG failure

2023-03-24 Thread Blaise Bourdin
Hi,

I am having issues with GAMG for some very ill-conditioned 2D linearized 
elasticity problems (sharp variation of elastic moduli with thin regions of 
nearly incompressible material). I use snes_type newtonls, linesearch_type cp, 
and pc_type gamg without any further options. pc_type jacobi converges fine 
(although slowly, of course).


I am not really surprised that gamg would not converge out of the box, but 
don’t know where to start to investigate the convergence failure. Can anybody 
help?

Blaise

— 
Canada Research Chair in Mathematical and Computational Aspects of Solid 
Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243
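For context, the two configurations described above expressed as run-time options (a sketch; names inferred from the description, so prefixes and exact spellings in the application may differ):

  -snes_type newtonls -snes_linesearch_type cp -pc_type gamg     (fails to converge)
  -snes_type newtonls -snes_linesearch_type cp -pc_type jacobi   (converges, but slowly)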



Re: [petsc-users] gamg failure with petsc-dev

2014-08-06 Thread Mark Adams



 Mark, should we provide some more flexible way to label fields?  It
 will be more complicated than the present code and I think packing into
 interlaced format is faster anyway.



I'm thinking this would entail reordering the matrix and inverting this
permutation for the fine-grid R and P.  This would be some work for a small
number of people.  It might be nice to be able to detect this and give a
warning, but I don't see how this could be done.


Re: [petsc-users] gamg failure with petsc-dev

2014-04-01 Thread Mark Adams
Stephan, I have pushed a pull request to fix this but for now you can just
use -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi.  This used to
be the default, but we moved to SOR recently.
Mark
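Applied to the ex49 reproducer used earlier in this thread, the workaround would look something like the following (an editor's sketch, not a command from the thread; the -elas_ prefix follows the earlier messages):

  ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode \
         -elas_mg_levels_ksp_type chebyshev -elas_mg_levels_pc_type jacobi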


On Sat, Mar 29, 2014 at 5:52 PM, Mark Adams mfad...@lbl.gov wrote:

 Sorry for getting to this late.  I think you have figured it out basically
 but there are a few things:

 1) You must set the block size of A (bs=2) for the null spaces to work and
 for aggregation MG to work properly. SA-AMG really does not make sense
 unless you work at the vertex level, for which we need the block size.

 2) You must be right that the zero column is because the aggregation
 produced a singleton aggregate.  And so the coarse grid is low rank.  This
 is not catastrophic, it is like a fake BC equations.  The numerics just
 have to work around it.  Jacobi does this.  I will fix SOR.

 Mark


 Ok, I found out a bit more. The fact that the prolongator has zero
 columns appears to arise in petsc 3.4 as well. The only reason it wasn't
 flagged before is that the default for the smoother (not the aggregation
 smoother but the standard pre and post smoothing) changed from jacobi to
 sor. I can make the example work with the additional option:

 $ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
 -elas_mg_levels_1_pc_type jacobi

 Vice versa, if in petsc 3.4.4 I change ex49 to include the near nullspace
 (the /* constrain near-null space bit */) at the end, it works with jacobi
 (the default in 3.4) but it breaks with sor with the same error message as
 above. I'm not entirely sure why jacobi doesn't give an error with a zero
 on the diagonal, but the zero column also means that the related coarse dof
 doesn't actually affect the fine grid solution.

 I think (but I might be barking up the wrong tree here) that the zero
 columns appear because the aggregation method typically will have a few
 small aggregates that are not big enough to support the polynomials of the
 near null space (i.e. the polynomials restricted to an aggregate are not
 linearly independent). A solution would be to reduce the number of
 polynomials for these aggregates (only take the linearly independent).
 Obviously this has the down-side that the degrees of freedom per aggregate
 at the coarse level is no longer a constant making the administration more
 complicated. It would be nice to find a solution though as I've always been
 taught that jacobi is not a robust smoother for multigrid.

 Cheers
 Stephan

Re: [petsc-users] gamg failure with petsc-dev

2014-04-01 Thread Stephan Kramer

On 01/04/14 16:07, Mark Adams wrote:

Stephan, I have pushed a pull request to fix this but for now you can just
use -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi.  This used to
be the default, but we moved to SOR recently.
Mark


Ah, that's great news. Thanks a lot for the effort. You're right: the previous 
defaults should be fine for us; your fix should hopefully only improve things.




On Sat, Mar 29, 2014 at 5:52 PM, Mark Adams mfad...@lbl.gov wrote:


Sorry for getting to this late.  I think you have figured it out basically
but there are a few things:

1) You must set the block size of A (bs=2) for the null spaces to work and
for aggregation MG to work properly. SA-AMG really does not make sense
unless you work at the vertex level, for which we need the block size.


Yes indeed. I've come to realize this now by looking into how smoothed aggregation with a near null space actually works. We currently have our dofs numbered the wrong way around (vertices on the 
inside, velocity component on the outside - which made sense for other equations we solve with the model), so it will take a bit of work, but it might well be worth the effort.


Thanks a lot for looking into this
Cheers
Stephan




2) You must be right that the zero column is because the aggregation
produced a singleton aggregate.  And so the coarse grid is low rank.  This
is not catastrophic, it is like a fake BC equations.  The numerics just
have to work around it.  Jacobi does this.  I will fix SOR.

Mark



Ok, I found out a bit more. The fact that the prolongator has zero
columns appears to arise in petsc 3.4 as well. The only reason it wasn't
flagged before is that the default for the smoother (not the aggregation
smoother but the standard pre and post smoothing) changed from jacobi to
sor. I can make the example work with the additional option:

$ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
-elas_mg_levels_1_pc_type jacobi

Vice versa, if in petsc 3.4.4 I change ex49 to include the near nullspace
(the /* constrain near-null space bit */) at the end, it works with jacobi
(the default in 3.4) but it breaks with sor with the same error message as
above. I'm not entirely sure why jacobi doesn't give an error with a zero
on the diagonal, but the zero column also means that the related coarse dof
doesn't actually affect the fine grid solution.

I think (but I might be barking up the wrong tree here) that the zero
columns appear because the aggregation method typically will have a few
small aggregates that are not big enough to support the polynomials of the
near null space (i.e. the polynomials restricted to an aggregate are not
linearly independent). A solution would be to reduce the number of
polynomials for these aggregates (only take the linearly independent).
Obviously this has the down-side that the degrees of freedom per aggregate
at the coarse level is no longer a constant making the administration more
complicated. It would be nice to find a solution though as I've always been
taught that jacobi is not a robust smoother for multigrid.

Cheers
 Stephan

Re: [petsc-users] gamg failure with petsc-dev

2014-04-01 Thread Jed Brown
Stephan Kramer s.kra...@imperial.ac.uk writes:
 Yes indeed. I've come to realize this now by looking into how smoothed
 aggregation with a near null space actually works. We currently have
 our dofs numbered the wrong way around (vertices on the inside,
 velocity component on the outside - which made sense for other eqns we
 solve with the model) so will take a bit of work, but might well be
 worth the effort

The memory streaming and cache reuse is much better if you interlace the
degrees of freedom.  This is as true now as it was at the time of the
PETSc-FUN3D papers.  When evaluating the physics, it can be useful to
pack the interlaced degrees of freedom into a vector-friendly ordering.

The AMG solve is plenty expensive that you can pack/solve/unpack an
interlaced vector at negligible cost without changing the rest of your
code.
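A minimal sketch of that pack/solve/unpack pattern with two components, assuming the application stores them as separate vectors u and v, and that xi and bi are interlaced work vectors of matching layout with block size 2 (all names are illustrative):

  /* pack: write each component into its slot of the interlaced solution guess */
  PetscCall(VecStrideScatter(u, 0, xi, INSERT_VALUES));
  PetscCall(VecStrideScatter(v, 1, xi, INSERT_VALUES));

  PetscCall(KSPSolve(ksp, bi, xi));   /* bi: interlaced right-hand side, packed the same way */

  /* unpack: pull the components back out into the application's ordering */
  PetscCall(VecStrideGather(xi, 0, u, INSERT_VALUES));
  PetscCall(VecStrideGather(xi, 1, v, INSERT_VALUES));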

Mark, should we provide some more flexible way to label fields?  It
will be more complicated than the present code and I think packing into
interlaced format is faster anyway.




Re: [petsc-users] gamg failure with petsc-dev

2014-03-29 Thread Mark Adams
Sorry for getting to this late.  I think you have figured it out basically
but there are a few things:

1) You must set the block size of A (bs=2) for the null spaces to work and
for aggregation MG to work properly. SA-AMG really does not make sense
unless you work at the vertex level, for which we need the block size.

2) You must be right that the zero column is because the aggregation
produced a singleton aggregate.  And so the coarse grid is low rank.  This
is not catastrophic, it is like a fake BC equations.  The numerics just
have to work around it.  Jacobi does this.  I will fix SOR.

Mark
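In current PETSc, both points can be addressed with a few calls; a sketch (A, coords, and the PetscCall macro are assumptions here, with coords being a vector of nodal coordinates having the same parallel layout as the solution):

  PetscCall(MatSetBlockSize(A, 2));                           /* point 1: two dofs per vertex in 2D */

  MatNullSpace nearnull;
  PetscCall(MatNullSpaceCreateRigidBody(coords, &nearnull));  /* builds (1,0), (0,1), (-y,x) */
  PetscCall(MatSetNearNullSpace(A, nearnull));
  PetscCall(MatNullSpaceDestroy(&nearnull));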


 Ok, I found out a bit more. The fact that the prolongator has zero columns
 appears to arise in petsc 3.4 as well. The only reason it wasn't flagged
 before is that the default for the smoother (not the aggregation smoother
 but the standard pre and post smoothing) changed from jacobi to sor. I can
 make the example work with the additional option:

 $ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
 -elas_mg_levels_1_pc_type jacobi

 Vice versa, if in petsc 3.4.4 I change ex49 to include the near nullspace
 (the /* constrain near-null space bit */) at the end, it works with jacobi
 (the default in 3.4) but it breaks with sor with the same error message as
 above. I'm not entirely sure why jacobi doesn't give an error with a zero
 on the diagonal, but the zero column also means that the related coarse dof
 doesn't actually affect the fine grid solution.

 I think (but I might be barking up the wrong tree here) that the zero
 columns appear because the aggregation method typically will have a few
 small aggregates that are not big enough to support the polynomials of the
 near null space (i.e. the polynomials restricted to an aggregate are not
 linearly independent). A solution would be to reduce the number of
 polynomials for these aggregates (only take the linearly independent).
 Obviously this has the down-side that the degrees of freedom per aggregate
 at the coarse level is no longer a constant making the administration more
 complicated. It would be nice to find a solution though as I've always been
 taught that jacobi is not a robust smoother for multigrid.

 Cheers
 Stephan

Re: [petsc-users] gamg failure with petsc-dev

2014-03-24 Thread Stephan Kramer

On 21/03/14 11:34, Stephan Kramer wrote:

On 21/03/14 04:24, Jed Brown wrote:

Stephan Kramer s.kra...@imperial.ac.uk writes:


We have been having some problems with GAMG on petsc-dev (master) for
cases that worked fine on petsc 3.4. We're solving a Stokes equation
(just the velocity block) for a simple convection in a square box
(isoviscous). The problem only occurs if we supply a near null space
(via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and
(-y,x) (near) null space vectors. If we supply those, the smoother
complains that the diagonal of the A matrix at the first coarsened
level contains a zero. If I dump out the prolongator from the finest
to the first coarsened level it indeed contains a zero column at that
same index. We're pretty confident that the fine level A matrix is
correct (it solves fine with LU). I've briefly spoken to Matt about
this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as
the default changed from 3.4 to dev) but that didn't make any
difference, the dumped out prolongator still has zero columns, and it
crashes in the same way. Do you have any further suggestions what to
try and how to further debug this?


Do you set the block size?  Can you reproduce by modifying
src/ksp/ksp/examples/tutorials/ex49.c (plane strain elasticity)?



I don't set a block size, no. About ex49: Ah great, with master (just updated 
now) I get:

[skramer@stommel]{/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials}$ 
./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
[0]PETSC ERROR: - Error Message 
--
[0]PETSC ERROR: Arguments are incompatible
[0]PETSC ERROR: Zero diagonal on row 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-3671-gbb161d1  GIT Date: 
2014-03-21 01:14:15 +
[0]PETSC ERROR: ./ex49 on a linux-gnu-c-opt named stommel by skramer Fri Mar 21 
11:25:55 2014
[0]PETSC ERROR: Configure options --download-fblaslapack=1 --download-blacs=1 
--download-scalapack=1 --download-ptscotch=1 --download-mumps=1 
--download-hypre=1 --download-suitesparse=1 --download-ml=1
[0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1728 in 
/data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1760 in 
/data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #3 MatSOR() line 3734 in 
/data/stephan/git/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 PCApply_SOR() line 35 in 
/data/stephan/git/petsc/src/ksp/pc/impls/sor/sor.c
[0]PETSC ERROR: #5 PCApply() line 440 in 
/data/stephan/git/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #6 KSP_PCApply() line 227 in 
/data/stephan/git/petsc/include/petsc-private/kspimpl.h
[0]PETSC ERROR: #7 KSPSolve_Chebyshev() line 456 in 
/data/stephan/git/petsc/src/ksp/ksp/impls/cheby/cheby.c
[0]PETSC ERROR: #8 KSPSolve() line 458 in 
/data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #9 PCMGMCycle_Private() line 19 in 
/data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #10 PCMGMCycle_Private() line 48 in 
/data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #11 PCApply_MG() line 330 in 
/data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #12 PCApply() line 440 in 
/data/stephan/git/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #13 KSP_PCApply() line 227 in 
/data/stephan/git/petsc/include/petsc-private/kspimpl.h
[0]PETSC ERROR: #14 KSPInitialResidual() line 63 in 
/data/stephan/git/petsc/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #15 KSPSolve_GMRES() line 234 in 
/data/stephan/git/petsc/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #16 KSPSolve() line 458 in 
/data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #17 solve_elasticity_2d() line 1053 in 
/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c
[0]PETSC ERROR: #18 main() line 1104 in 
/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c
[0]PETSC ERROR: End of Error Message ---send entire error 
message to petsc-ma...@mcs.anl.gov--

Which is the same error we were getting on our problem
Cheers
Stephan




Ok, I found out a bit more. The fact that the prolongator has zero columns appears to arise in petsc 3.4 as well. The only reason it wasn't flagged before is that the default for the smoother (not the 
aggregation smoother but the standard pre and post smoothing) changed from jacobi to sor. I can make the example work with the additional option:


$ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode 
-elas_mg_levels_1_pc_type jacobi

Vice versa, if in petsc 3.4.4 I change ex49 to include the near nullspace (the /* constrain near-null space bit */) at the end, it works with jacobi (the default in 3.4) but it breaks with sor with 
the same error message as above. I'm not entirely sure why jacobi doesn't 

Re: [petsc-users] gamg failure with petsc-dev

2014-03-21 Thread Stephan Kramer

On 21/03/14 04:24, Jed Brown wrote:

Stephan Kramer s.kra...@imperial.ac.uk writes:


We have been having some problems with GAMG on petsc-dev (master) for
cases that worked fine on petsc 3.4. We're solving a Stokes equation
(just the velocity block) for a simple convection in a square box
(isoviscous). The problem only occurs if we supply a near null space
(via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and
(-y,x) (near) null space vectors. If we supply those, the smoother
complains that the diagonal of the A matrix at the first coarsened
level contains a zero. If I dump out the prolongator from the finest
to the first coarsened level it indeed contains a zero column at that
same index. We're pretty confident that the fine level A matrix is
correct (it solves fine with LU). I've briefly spoken to Matt about
this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as
the default changed from 3.4 to dev) but that didn't make any
difference, the dumped out prolongator still has zero columns, and it
crashes in the same way. Do you have any further suggestions what to
try and how to further debug this?


Do you set the block size?  Can you reproduce by modifying
src/ksp/ksp/examples/tutorials/ex49.c (plane strain elasticity)?



I don't set a block size, no. About ex49: Ah great, with master (just updated 
now) I get:

[skramer@stommel]{/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials}$ 
./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
[0]PETSC ERROR: - Error Message 
--
[0]PETSC ERROR: Arguments are incompatible
[0]PETSC ERROR: Zero diagonal on row 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-3671-gbb161d1  GIT Date: 
2014-03-21 01:14:15 +
[0]PETSC ERROR: ./ex49 on a linux-gnu-c-opt named stommel by skramer Fri Mar 21 
11:25:55 2014
[0]PETSC ERROR: Configure options --download-fblaslapack=1 --download-blacs=1 
--download-scalapack=1 --download-ptscotch=1 --download-mumps=1 
--download-hypre=1 --download-suitesparse=1 --download-ml=1
[0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1728 in 
/data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1760 in 
/data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: #3 MatSOR() line 3734 in 
/data/stephan/git/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 PCApply_SOR() line 35 in 
/data/stephan/git/petsc/src/ksp/pc/impls/sor/sor.c
[0]PETSC ERROR: #5 PCApply() line 440 in 
/data/stephan/git/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #6 KSP_PCApply() line 227 in 
/data/stephan/git/petsc/include/petsc-private/kspimpl.h
[0]PETSC ERROR: #7 KSPSolve_Chebyshev() line 456 in 
/data/stephan/git/petsc/src/ksp/ksp/impls/cheby/cheby.c
[0]PETSC ERROR: #8 KSPSolve() line 458 in 
/data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #9 PCMGMCycle_Private() line 19 in 
/data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #10 PCMGMCycle_Private() line 48 in 
/data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #11 PCApply_MG() line 330 in 
/data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: #12 PCApply() line 440 in 
/data/stephan/git/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #13 KSP_PCApply() line 227 in 
/data/stephan/git/petsc/include/petsc-private/kspimpl.h
[0]PETSC ERROR: #14 KSPInitialResidual() line 63 in 
/data/stephan/git/petsc/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: #15 KSPSolve_GMRES() line 234 in 
/data/stephan/git/petsc/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: #16 KSPSolve() line 458 in 
/data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #17 solve_elasticity_2d() line 1053 in 
/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c
[0]PETSC ERROR: #18 main() line 1104 in 
/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c
[0]PETSC ERROR: End of Error Message ---send entire error 
message to petsc-ma...@mcs.anl.gov--

Which is the same error we were getting on our problem
Cheers
Stephan


[petsc-users] gamg failure with petsc-dev

2014-03-20 Thread Stephan Kramer

Hi guys,

We have been having some problems with GAMG on petsc-dev (master) for cases that worked fine on petsc 3.4. We're solving a Stokes equation (just the velocity block) for a simple convection in a square 
box (isoviscous). The problem only occurs if we supply a near null space (via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and (-y,x) (near) null space vectors. If we supply those, the 
smoother complains that the diagonal of the A matrix at the first coarsened level contains a zero. If I dump out the prolongator from the finest to the first coarsened level it indeed contains a zero 
column at that same index. We're pretty confident that the fine level A matrix is correct (it solves fine with LU). I've briefly spoken to Matt about this and he suggested trying to run with 
-pc_gamg_agg_nsmooths 0 (as the default changed from 3.4 to dev) but that didn't make any 
further suggestions what to try and how to further debug this?


Cheers
Stephan
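For readers reproducing this setup, a sketch of attaching those three vectors by hand with the current API (rbm[3] are assumed to be already-created, orthonormalized vectors holding (1,0), (0,1), (-y,x) at the nodes, with the same layout as the solution, and A is the velocity-block matrix):

  MatNullSpace nearnull;
  PetscCall(MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE, 3, rbm, &nearnull));
  PetscCall(MatSetNearNullSpace(A, nearnull));
  PetscCall(MatNullSpaceDestroy(&nearnull));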


Re: [petsc-users] gamg failure with petsc-dev

2014-03-20 Thread Jed Brown
Stephan Kramer s.kra...@imperial.ac.uk writes:

 We have been having some problems with GAMG on petsc-dev (master) for
 cases that worked fine on petsc 3.4. We're solving a Stokes equation
 (just the velocity block) for a simple convection in a square box
 (isoviscous). The problem only occurs if we supply a near null space
 (via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and
 (-y,x) (near) null space vectors. If we supply those, the smoother
 complains that the diagonal of the A matrix at the first coarsened
 level contains a zero. If I dump out the prolongator from the finest
 to the first coarsened level it indeed contains a zero column at that
 same index. We're pretty confident that the fine level A matrix is
 correct (it solves fine with LU). I've briefly spoken to Matt about
 this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as
 the default changed from 3.4 to dev) but that didn't make any
 difference, the dumped out prolongator still has zero columns, and it
 crashes in the same way. Do you have any further suggestions what to
 try and how to further debug this?

Do you set the block size?  Can you reproduce by modifying
src/ksp/ksp/examples/tutorials/ex49.c (plane strain elasticity)?

