Re: [petsc-users] GAMG failure

2023-03-27 Thread Mark Adams
Yes, the eigen estimates are converging slowly.

BTW, have you tried hypre? It is a good solver (many more person-years of
development have gone into it). These eigen estimates are conceptually simple,
but they can lead to problems like this (hypre uses a smoother that does not
need an eigen estimate).

But try this (good to have options anyway):

-pc_gamg_esteig_ksp_max_it 20

Chebyshev will scale the eigen estimate that we give it by, I think, 5% by
default. Maybe 10%.
You can set that with:

-mg_levels_ksp_chebyshev_esteig 0,0.2,0,1.05

0.2 is the scaling of the high eigen estimate for the low eigen value in
Chebyshev.


On Mon, Mar 27, 2023 at 5:06 PM Blaise Bourdin  wrote:

>
>
> On Mar 24, 2023, at 3:21 PM, Mark Adams  wrote:
>
> * Do you set:
>
> PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
>
> PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));
>
>
> Yes
>
>
> Do that to get CG Eigen estimates. Outright failure is usually caused by a
> bad Eigen estimate.
> -pc_gamg_esteig_ksp_monitor_singular_value
> Will print out the estimates as it is iterating. You can look at that to
> check that the max has converged.
>
>
> I just did, and something is off:
> I do multiple calls to SNESSolve (staggered scheme for phase-field
> fracture), but only get information on the first solve (which is not the
> one failing, of course)
> Here is what I get:
> Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 7.636421712860e+01 % max 1.e+00 min
> 1.e+00 max/min 1.e+00
>   1 KSP Residual norm 3.402024867977e+01 % max 1.114319928921e+00 min
> 1.114319928921e+00 max/min 1.e+00
>   2 KSP Residual norm 2.124815079671e+01 % max 1.501143586520e+00 min
> 5.739351119078e-01 max/min 2.615528402732e+00
>   3 KSP Residual norm 1.581785698912e+01 % max 1.644351137983e+00 min
> 3.263683482596e-01 max/min 5.038329074347e+00
>   4 KSP Residual norm 1.254871990315e+01 % max 1.714668863819e+00 min
> 2.044075812142e-01 max/min 8.388479789416e+00
>   5 KSP Residual norm 1.051198229090e+01 % max 1.760078533063e+00 min
> 1.409327403114e-01 max/min 1.248878386367e+01
>   6 KSP Residual norm 9.061658306086e+00 % max 1.792995287686e+00 min
> 1.023484740555e-01 max/min 1.751853463603e+01
>   7 KSP Residual norm 8.015529297567e+00 % max 1.821497535985e+00 min
> 7.818018001928e-02 max/min 2.329871248104e+01
>   8 KSP Residual norm 7.201063258957e+00 % max 1.855140071935e+00 min
> 6.178572472468e-02 max/min 3.002538337458e+01
>   9 KSP Residual norm 6.548491711695e+00 % max 1.903578294573e+00 min
> 5.008612895206e-02 max/min 3.800609738466e+01
>  10 KSP Residual norm 6.002109992255e+00 % max 1.961356890125e+00 min
> 4.130572033722e-02 max/min 4.748390475004e+01
>   Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 2.373573910237e+02 % max 1.e+00 min
> 1.e+00 max/min 1.e+00
>   1 KSP Residual norm 8.845061415709e+01 % max 1.081192207576e+00 min
> 1.081192207576e+00 max/min 1.e+00
>   2 KSP Residual norm 5.607525485152e+01 % max 1.345947059840e+00 min
> 5.768825326129e-01 max/min 2.333138869267e+00
>   3 KSP Residual norm 4.123522550864e+01 % max 1.481153523075e+00 min
> 3.070603564913e-01 max/min 4.823655974348e+00
>   4 KSP Residual norm 3.345765664017e+01 % max 1.551374710727e+00 min
> 1.953487694959e-01 max/min 7.941563771968e+00
>   5 KSP Residual norm 2.859712984893e+01 % max 1.604588395452e+00 min
> 1.313871480574e-01 max/min 1.221267391199e+01
>   6 KSP Residual norm 2.525636054248e+01 % max 1.650487481750e+00 min
> 9.322735730688e-02 max/min 1.770389646804e+01
>   7 KSP Residual norm 2.270711391451e+01 % max 1.697243639599e+00 min
> 6.945419058256e-02 max/min 2.443687883140e+01
>   8 KSP Residual norm 2.074739485241e+01 % max 1.737293728907e+00 min
> 5.319942519758e-02 max/min 3.265624999621e+01
>   9 KSP Residual norm 1.912808268870e+01 % max 1.771708608618e+00 min
> 4.229776586667e-02 max/min 4.188657656771e+01
>  10 KSP Residual norm 1.787394414641e+01 % max 1.802834420843e+00 min
> 3.460455235448e-02 max/min 5.209818645753e+01
>   Residual norms for Displacement_pc_gamg_esteig_ solve.
>   0 KSP Residual norm 1.361990679391e+03 % max 1.e+00 min
> 1.e+00 max/min 1.e+00
>   1 KSP Residual norm 5.377188333825e+02 % max 1.086812916769e+00 min
> 1.086812916769e+00 max/min 1.e+00
>   2 KSP Residual norm 2.819790765047e+02 % max 1.474233179517e+00 min
> 6.475176340551e-01 max/min 2.276745994212e+00
>   3 KSP Residual norm 1.856720658591e+02 % max 1.646049713883e+00 min
> 4.391851040105e-01 max/min 3.747963441500e+00
>   4 KSP Residual norm 1.446507859917e+02 % max 1.760403013135e+00 min
> 2.972886103795e-01 max/min 5.921528614526e+00
>   5 KSP Residual norm 1.212491636433e+02 % max 1.839250080524e+00 min
> 1.921591413785e-01 max/min 9.571494061277e+00
>   6 KSP Residual norm 1.052783637696e+02 % max 1.887062042760e+00 min
> 1.275920366984e-01 max/min 1.478981048966e+01
>   7 KSP 

Re: [petsc-users] Petsc DMLabel Fortran Stub request

2023-03-27 Thread Matthew Knepley
On Fri, Jan 6, 2023 at 10:03 AM Nicholas Arnold-Medabalimi <
narno...@umich.edu> wrote:

> Hi Petsc Users
>

I apologize. I found this email today and it looks like no one answered.


> I am trying to use the sequence of
> call DMLabelPropagateBegin(synchLabel,sf,ierr)
> call
> DMLabelPropagatePush(synchLabel,sf,PETSC_NULL_OPTIONS,PETSC_NULL_INTEGER,ierr)
> call DMLabelPropagateEnd(synchLabel,sf, ierr)
> in fortran.
>
> I apologize if I messed something up, it appears as if the
> DMLabelPropagatePush command doesn't have an appropriate Fortran interface
> as I get an undefined reference when it is called.
>

Yes, it takes a function pointer, and using function pointers with Fortran
is not easy, although it can be done. It might be better to create a C
function with some default marking and then wrap that. What do you want to
do?

  Thanks,

 Matt


> I would appreciate any assistance.
>
> As a side note in practice, what is the proper Fortran NULL pointer to use
> for void arguments? I used an integer one temporarily to get to the
> undefined reference error but I assume it doesn't matter?
>
>
> Sincerely
> Nicholas
>
> --
> Nicholas Arnold-Medabalimi
>
> Ph.D. Candidate
> Computational Aeroscience Lab
> University of Michigan
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] [petsc-maint] DMSwarm documentation

2023-03-27 Thread Matthew Knepley
On Mon, Mar 27, 2023 at 10:19 AM Joauma Marichal <
joauma.maric...@uclouvain.be> wrote:

> Hello,
>
>
>
> I am writing to you as I am trying to find documentation about a function
> that would remove several particles (given their index). I was using:
>
> DMSwarmRemovePointAtIndex(*swarm, to_remove[p]);
>
> But I need something that removes several particles at a time.
>

There are no functions taking a list of points to remove.

  Thanks,

 Matt


> Petsc.org seems to be down and I was wondering if there was any other way
> to get this kind of information.
>
>
>
> Thanks a lot for your help.
>
> Best regards,
>
>
>
> Joauma
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] GAMG failure

2023-03-27 Thread Jed Brown
Try -pc_gamg_reuse_interpolation 0. I thought this was disabled by default, but 
I see pc_gamg->reuse_prol = PETSC_TRUE in the code.

Blaise Bourdin  writes:

>  On Mar 24, 2023, at 3:21 PM, Mark Adams  wrote:
>
>  * Do you set: 
>
>  PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
>
>  PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));
>
> Yes
>
>  Do that to get CG Eigen estimates. Outright failure is usually caused by a 
> bad Eigen estimate.
>  -pc_gamg_esteig_ksp_monitor_singular_value
>  Will print out the estimates as it is iterating. You can look at that to check 
> that the max has converged.
>
> I just did, and something is off:
> I do multiple calls to SNESSolve (staggered scheme for phase-field
> fracture), but only get information on the first solve (which is not the
> one failing, of course)
> Here is what I get:
> [KSP eigen-estimate monitor log elided; the full log appears in the
> original message further down the thread.]

Re: [petsc-users] GAMG failure

2023-03-27 Thread Blaise Bourdin

On Mar 24, 2023, at 3:21 PM, Mark Adams  wrote:

* Do you set:

    PetscCall(MatSetOption(Amat, MAT_SPD, PETSC_TRUE));
    PetscCall(MatSetOption(Amat, MAT_SPD_ETERNAL, PETSC_TRUE));

Yes

Do that to get CG Eigen estimates. Outright failure is usually caused by a bad Eigen estimate.
-pc_gamg_esteig_ksp_monitor_singular_value
Will print out the estimates as it is iterating. You can look at that to check that the max has converged.

I just did, and something is off:
I do multiple calls to SNESSolve (staggered scheme for phase-field fracture), but only get information on the first solve (which is not the one failing, of course)
Here is what I get:

Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 7.636421712860e+01 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 3.402024867977e+01 % max 1.114319928921e+00 min 1.114319928921e+00 max/min 1.e+00
  2 KSP Residual norm 2.124815079671e+01 % max 1.501143586520e+00 min 5.739351119078e-01 max/min 2.615528402732e+00
  3 KSP Residual norm 1.581785698912e+01 % max 1.644351137983e+00 min 3.263683482596e-01 max/min 5.038329074347e+00
  4 KSP Residual norm 1.254871990315e+01 % max 1.714668863819e+00 min 2.044075812142e-01 max/min 8.388479789416e+00
  5 KSP Residual norm 1.051198229090e+01 % max 1.760078533063e+00 min 1.409327403114e-01 max/min 1.248878386367e+01
  6 KSP Residual norm 9.061658306086e+00 % max 1.792995287686e+00 min 1.023484740555e-01 max/min 1.751853463603e+01
  7 KSP Residual norm 8.015529297567e+00 % max 1.821497535985e+00 min 7.818018001928e-02 max/min 2.329871248104e+01
  8 KSP Residual norm 7.201063258957e+00 % max 1.855140071935e+00 min 6.178572472468e-02 max/min 3.002538337458e+01
  9 KSP Residual norm 6.548491711695e+00 % max 1.903578294573e+00 min 5.008612895206e-02 max/min 3.800609738466e+01
 10 KSP Residual norm 6.002109992255e+00 % max 1.961356890125e+00 min 4.130572033722e-02 max/min 4.748390475004e+01
  Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 2.373573910237e+02 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 8.845061415709e+01 % max 1.081192207576e+00 min 1.081192207576e+00 max/min 1.e+00
  2 KSP Residual norm 5.607525485152e+01 % max 1.345947059840e+00 min 5.768825326129e-01 max/min 2.333138869267e+00
  3 KSP Residual norm 4.123522550864e+01 % max 1.481153523075e+00 min 3.070603564913e-01 max/min 4.823655974348e+00
  4 KSP Residual norm 3.345765664017e+01 % max 1.551374710727e+00 min 1.953487694959e-01 max/min 7.941563771968e+00
  5 KSP Residual norm 2.859712984893e+01 % max 1.604588395452e+00 min 1.313871480574e-01 max/min 1.221267391199e+01
  6 KSP Residual norm 2.525636054248e+01 % max 1.650487481750e+00 min 9.322735730688e-02 max/min 1.770389646804e+01
  7 KSP Residual norm 2.270711391451e+01 % max 1.697243639599e+00 min 6.945419058256e-02 max/min 2.443687883140e+01
  8 KSP Residual norm 2.074739485241e+01 % max 1.737293728907e+00 min 5.319942519758e-02 max/min 3.265624999621e+01
  9 KSP Residual norm 1.912808268870e+01 % max 1.771708608618e+00 min 4.229776586667e-02 max/min 4.188657656771e+01
 10 KSP Residual norm 1.787394414641e+01 % max 1.802834420843e+00 min 3.460455235448e-02 max/min 5.209818645753e+01
  Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 1.361990679391e+03 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 5.377188333825e+02 % max 1.086812916769e+00 min 1.086812916769e+00 max/min 1.e+00
  2 KSP Residual norm 2.819790765047e+02 % max 1.474233179517e+00 min 6.475176340551e-01 max/min 2.276745994212e+00
  3 KSP Residual norm 1.856720658591e+02 % max 1.646049713883e+00 min 4.391851040105e-01 max/min 3.747963441500e+00
  4 KSP Residual norm 1.446507859917e+02 % max 1.760403013135e+00 min 2.972886103795e-01 max/min 5.921528614526e+00
  5 KSP Residual norm 1.212491636433e+02 % max 1.839250080524e+00 min 1.921591413785e-01 max/min 9.571494061277e+00
  6 KSP Residual norm 1.052783637696e+02 % max 1.887062042760e+00 min 1.275920366984e-01 max/min 1.478981048966e+01
  7 KSP Residual norm 9.230292625762e+01 % max 1.917891358356e+00 min 8.853577120467e-02 max/min 2.166233300122e+01
  8 KSP Residual norm 8.262607594297e+01 % max 1.935857204308e+00 min 6.706949937710e-02 max/min 2.886345093206e+01
  9 KSP Residual norm 7.616474911000e+01 % max 1.946323901431e+00 min 5.354310733090e-02 max/min 3.635059671458e+01
 10 KSP Residual norm 7.138356892221e+01 % max 1.954382723686e+00 min 4.367661484659e-02 max/min 4.474666204216e+01
  Residual norms for Displacement_pc_gamg_esteig_ solve.
  0 KSP Residual norm 3.702300162209e+03 % max 1.e+00 min 1.e+00 max/min 1.e+00
  1 KSP Residual norm 1.255008322497e+03 % max 

Re: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device.

2023-03-27 Thread Fackler, Philip via petsc-users
Junchao,

I'm realizing I left you hanging in this email thread. Thank you so much for 
addressing the problem. I have tested it (successfully) using one process and 
one GPU. I'm still attempting to test with multiple GPUs (one per task) on 
another machine. I'll let you know if I see any more trouble.

Thanks again,

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory

From: Junchao Zhang
Sent: Tuesday, February 7, 2023 16:26
To: Fackler, Philip
Cc: xolotl-psi-developm...@lists.sourceforge.net; petsc-users@mcs.anl.gov; Blondel, Sophie; Roth, Philip

Subject: Re: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec 
diverging when running on CUDA device.

Hi, Philip,
  I believe this MR would fix the problem:
https://gitlab.com/petsc/petsc/-/merge_requests/6030
  It is a fix to petsc/release, but you can cherry-pick it to petsc/main.
  Could you try that in your case?
  Thanks.
--Junchao Zhang


On Fri, Jan 20, 2023 at 11:31 AM Junchao Zhang  wrote:
Sorry, no progress. I guess that is because a vector was gotten but not
restored (e.g., a VecGetArray() without the matching VecRestoreArray()),
causing the host and device data to get out of sync. Maybe in your code, or
in the PETSc code.
After the ECP AM, I will have more time on this bug.
Thanks.
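For context, the pairing Junchao describes looks roughly like this (a minimal
hedged sketch, not code from Xolotl or PETSc; error handling and setup
elided). Every get must be matched by a restore so PETSc knows the host copy
was modified:

```
/* Hedged sketch: VecGetArray() hands out the host pointer; without the
   matching VecRestoreArray(), PETSc cannot mark the host data as
   modified, and the device (GPU) copy silently goes stale. */
PetscScalar *a;
PetscCall(VecGetArray(x, &a));      /* host pointer; device may now be stale */
for (PetscInt i = 0; i < n; i++) a[i] *= 2.0;
PetscCall(VecRestoreArray(x, &a));  /* marks host modified -> device resyncs */
```

A missing restore anywhere on the solve path is exactly the kind of bug that
only shows up once a GPU backend (Kokkos/CUDA) is enabled.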

--Junchao Zhang


On Fri, Jan 20, 2023 at 11:00 AM Fackler, Philip  wrote:
Any progress on this? Any info/help needed?

Thanks,

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory

From: Fackler, Philip
Sent: Thursday, December 8, 2022 09:07
To: Junchao Zhang
Cc: xolotl-psi-developm...@lists.sourceforge.net; petsc-users@mcs.anl.gov; Blondel, Sophie; Roth, Philip
Subject: Re: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec 
diverging when running on CUDA device.

Great! Thank you!

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory

From: Junchao Zhang
Sent: Wednesday, December 7, 2022 18:47
To: Fackler, Philip
Cc: xolotl-psi-developm...@lists.sourceforge.net; petsc-users@mcs.anl.gov; Blondel, Sophie; Roth, Philip
Subject: Re: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec 
diverging when running on CUDA device.

Hi, Philip,
 I could reproduce the error. I need to find a way to debug it. Thanks.

/home/jczhang/xolotl/test/system/SystemTestCase.cpp(317): fatal error: in 
"System/PSI_1": absolute value of diffNorm{0.19704848134353209} exceeds 1e-10
*** 1 failure is detected in the test module "Regression"

--Junchao Zhang


On Tue, Dec 6, 2022 at 10:10 AM Fackler, Philip  wrote:
I think it would be simpler to use the develop branch for this issue. But you 
can still just build the SystemTester. Then (if you changed the PSI_1 case) run:

 ./test/system/SystemTester -t System/PSI_1 -- -v

(No need for multiple MPI ranks)

Thanks,

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory

From: Junchao Zhang
Sent: Monday, December 5, 2022 15:40
To: Fackler, Philip
Cc: xolotl-psi-developm...@lists.sourceforge.net; petsc-users@mcs.anl.gov; Blondel, Sophie; Roth, Philip

Re: [petsc-users] [petsc-maint] DMSwarm documentation

2023-03-27 Thread Barry Smith

  petsc.org can be flaky: it may hang for a few seconds or occasionally not
respond, but trying again should work.

  Barry


> On Mar 27, 2023, at 10:13 AM, Joauma Marichal  
> wrote:
> 
> Hello, 
>  
> I am writing to you as I am trying to find documentation about a function 
> that would remove several particles (given their index). I was using:
> DMSwarmRemovePointAtIndex(*swarm, to_remove[p]);
> But I need something that removes several particles at a time.
>  
> Petsc.org  seems to be down and I was wondering if there 
> was any other way to get this kind of information.
>  
> Thanks a lot for your help. 
> 
> Best regards, 
>  
> Joauma



Re: [petsc-users] Using PETSc Testing System

2023-03-27 Thread Matthew Knepley
On Mon, Mar 27, 2023 at 10:19 AM Jacob Faibussowitsch 
wrote:

> Our testing framework was pretty much tailor-made for the PETSc src tree
> and as such has many hard-coded paths and decisions. I’m going to go out on
> a limb and say you probably won’t get this to work...
>

I think we can help you get this to work. I have wanted to generalize the
test framework for a long time. Everything is built by

  config/gmakegentest.py

and I think we can get away with just changing paths here and everything
will work.

  Thanks!

 Matt


> That being said, one of the “base” paths that the testing harness uses to
> initially find tests is the `TESTSRCDIR` variable in
> `${PETSC_DIR}/gmakefile.test`. It is currently defined as
> ```
> # TESTSRCDIR is always relative to gmakefile.test
> #  This must be before includes
> mkfile_path := $(abspath $(lastword $(MAKEFILE_LIST)))
> TESTSRCDIR := $(dir $(mkfile_path))src
> ```
> You should start by changing this to
> ```
> # TESTSRCDIR is always relative to gmakefile.test
> #  This must be before includes
> mkfile_path := $(abspath $(lastword $(MAKEFILE_LIST)))
> TESTSRCDIR ?= $(dir $(mkfile_path))src
> ```
> That way you could run your tests via
> ```
> $ make test TESTSRCDIR=/path/to/your/src/dir
> ```
> I am sure there are many other modifications you will need to make.
>
> Best regards,
>
> Jacob Faibussowitsch
> (Jacob Fai - booss - oh - vitch)
>
> > On Mar 27, 2023, at 06:14, Daniele Prada 
> wrote:
> >
> > Hello everyone,
> >
> > I would like to use the PETSc Testing System for testing a package that
> I am developing.
> >
> > I have read the PETSc developer documentation and have written some
> tests using the PETSc Test Description Language. I am going through the
> files in ${PETSC_DIR}/config but I am not able to make the testing system
> look into the directory tree of my project.
> >
> > Any suggestions?
> >
> > Thanks in advance
> > Daniele
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


[petsc-users] DMSwarm documentation

2023-03-27 Thread Joauma Marichal
Hello,

I am writing to you as I am trying to find documentation about a function that 
would remove several particles (given their index). I was using:
DMSwarmRemovePointAtIndex(*swarm, to_remove[p]);
But I need something that removes several particles at a time.

Petsc.org seems to be down and I was wondering if there was any other way to 
get this kind of information.

Thanks a lot for your help.

Best regards,

Joauma


Re: [petsc-users] Using PETSc Testing System

2023-03-27 Thread Jacob Faibussowitsch
Our testing framework was pretty much tailor-made for the PETSc src tree and as 
such has many hard-coded paths and decisions. I’m going to go out on a limb and 
say you probably won’t get this to work...

That being said, one of the “base” paths that the testing harness uses to 
initially find tests is the `TESTSRCDIR` variable in 
`${PETSC_DIR}/gmakefile.test`. It is currently defined as 
```
# TESTSRCDIR is always relative to gmakefile.test
#  This must be before includes
mkfile_path := $(abspath $(lastword $(MAKEFILE_LIST)))
TESTSRCDIR := $(dir $(mkfile_path))src
```
You should start by changing this to
```
# TESTSRCDIR is always relative to gmakefile.test
#  This must be before includes
mkfile_path := $(abspath $(lastword $(MAKEFILE_LIST)))
TESTSRCDIR ?= $(dir $(mkfile_path))src
```
That way you could run your tests via
```
$ make test TESTSRCDIR=/path/to/your/src/dir
```
I am sure there are many other modifications you will need to make.

Best regards,

Jacob Faibussowitsch
(Jacob Fai - booss - oh - vitch)

> On Mar 27, 2023, at 06:14, Daniele Prada  wrote:
> 
> Hello everyone,
> 
> I would like to use the PETSc Testing System for testing a package that I am 
> developing.
> 
> I have read the PETSc developer documentation and have written some tests 
> using the PETSc Test Description Language. I am going through the files in 
> ${PETSC_DIR}/config but I am not able to make the testing system look into 
> the directory tree of my project.
> 
> Any suggestions?
> 
> Thanks in advance
> Daniele



[petsc-users] Using PETSc Testing System

2023-03-27 Thread Daniele Prada
Hello everyone,

I would like to use the PETSc Testing System for testing a package that I
am developing.

I have read the PETSc developer documentation and have written some tests
using the PETSc Test Description Language. I am going through the files in
${PETSC_DIR}/config but I am not able to make the testing system look into
the directory tree of my project.

Any suggestions?

Thanks in advance
Daniele
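For anyone following this thread later: a test in the PETSc Test Description
Language is an embedded comment block at the bottom of the example's source
file, which config/gmakegentest.py scans to generate the test targets. A
minimal sketch (the suffix, nsize, and args values here are made up):

```
/*TEST
  test:
    suffix: 1
    nsize: 2
    args: -ksp_monitor -pc_type gamg
TEST*/
```

Getting the harness to find such blocks outside the PETSc source tree is
exactly the path-override question discussed in the replies above.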