Re: [petsc-users] How to use DM_BOUNDARY_GHOSTED for Dirichlet boundary conditions

2023-02-27 Thread Paul Grosse-Bley

The scaling might be the problem, especially since I don't know what you mean 
by scaling it according to FE.

For reproducing the issue with a smaller problem:
Change the ComputeRHS function in ex45.c

if (i == 0 || j == 0 || k == 0 || i == mx - 1 || j == my - 1 || k == mz - 1) {
  barray[k][j][i] = 0.0; /* homogeneous Dirichlet boundary: u = 0 */
} else {
  barray[k][j][i] = 1.0; /* constant interior source: f = 1 */
}
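In context, the modified ComputeRHS loop would look roughly like this (a sketch of the assumed ex45.c structure; mx, my, mz and the corner variables come from DMDAGetInfo/DMDAGetCorners as in the example):

PetscCall(DMDAGetInfo(dm, NULL, &mx, &my, &mz, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL));
PetscCall(DMDAGetCorners(dm, &xs, &ys, &zs, &xm, &ym, &zm));
PetscCall(DMDAVecGetArray(dm, b, &barray));
for (k = zs; k < zs + zm; k++) {
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      if (i == 0 || j == 0 || k == 0 || i == mx - 1 || j == my - 1 || k == mz - 1) {
        barray[k][j][i] = 0.0; /* homogeneous Dirichlet boundary */
      } else {
        barray[k][j][i] = 1.0; /* constant interior source */
      }
    }
  }
}
PetscCall(DMDAVecRestoreArray(dm, b, &barray));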

Change the dimensions to e.g. 33 instead of 7 (I scaled it down so that it runs 
quickly without a GPU) and then run with

-ksp_converged_reason -ksp_type richardson -ksp_rtol 1e-09 -pc_type mg 
-pc_mg_levels 3 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 6 
-mg_levels_ksp_converged_maxits -mg_levels_pc_type jacobi -mg_coarse_ksp_type 
richardson -mg_coarse_ksp_max_it 6 -mg_coarse_ksp_converged_maxits 
-mg_coarse_pc_type jacobi
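For reference, the full invocation might look like this (a sketch; it assumes the 
grid size is set through the standard DMDA options -da_grid_x/y/z rather than by 
editing the source):

./ex45 -da_grid_x 33 -da_grid_y 33 -da_grid_z 33 \
  -ksp_converged_reason -ksp_type richardson -ksp_rtol 1e-09 -pc_type mg \
  -pc_mg_levels 3 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 6 \
  -mg_levels_ksp_converged_maxits -mg_levels_pc_type jacobi \
  -mg_coarse_ksp_type richardson -mg_coarse_ksp_max_it 6 \
  -mg_coarse_ksp_converged_maxits -mg_coarse_pc_type jacobi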

You will find that it takes 145 iterations instead of 25 for the original ex45 
RHS. My hpgmg-cuda implementation (using 32^3) takes 41 iterations.

To what value do I have to change the diagonal entries of the matrix for the 
boundary rows according to FE? Right now the diagonal is completely constant.

Paul

On Tuesday, February 28, 2023 00:23 CET, Barry Smith  wrote:
 
I have not seen explicitly including, or excluding, the Dirichlet boundary 
values in the system having a significant effect on the convergence so long as 
you SCALE the diagonal rows (of those Dirichlet points) by a value similar to 
the other entries along the diagonal. If they are scaled completely 
differently, that can screw up the convergence. For src/ksp/ksp/ex45.c I see 
that the appropriate scaling is used (note the scaling should come from a 
finite element view of the discretization, even if the discretization is finite 
differences, as is done in ex45.c).
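To make "scale it according to FE" concrete, here is a minimal sketch (not a copy 
of either code) of a 7-point ComputeMatrix stencil in the style of ex45.c; Hx, Hy, 
Hz are the grid spacings and the helper names below are assumptions modeled on 
that example. The point is that a Dirichlet row keeps only its diagonal, but that 
diagonal is 2*(Hy*Hz/Hx + Hx*Hz/Hy + Hx*Hy/Hz), i.e. 6h on a uniform grid of 
spacing h (about 0.19 for h = 1/32), rather than 1.0, so it stays commensurate 
with the interior rows.

/* Sketch only: assumed names (B is the preconditioner matrix, i/j/k loop over the
   owned grid points, mx/my/mz are the global dimensions), in the spirit of ex45.c. */
MatStencil  row = {.i = i, .j = j, .k = k}, col[7];
PetscScalar v[7];
PetscScalar HyHzdHx = Hy * Hz / Hx, HxHzdHy = Hx * Hz / Hy, HxHydHz = Hx * Hy / Hz;
PetscScalar diag    = 2.0 * (HyHzdHx + HxHzdHy + HxHydHz);

if (i == 0 || j == 0 || k == 0 || i == mx - 1 || j == my - 1 || k == mz - 1) {
  /* Dirichlet row: diagonal only, but scaled like an interior diagonal, not set to 1.0 */
  v[0] = diag;
  PetscCall(MatSetValuesStencil(B, 1, &row, 1, &row, v, INSERT_VALUES));
} else {
  v[0] = -HxHydHz; col[0] = (MatStencil){.i = i, .j = j, .k = k - 1};
  v[1] = -HxHzdHy; col[1] = (MatStencil){.i = i, .j = j - 1, .k = k};
  v[2] = -HyHzdHx; col[2] = (MatStencil){.i = i - 1, .j = j, .k = k};
  v[3] = diag;     col[3] = (MatStencil){.i = i, .j = j, .k = k};
  v[4] = -HyHzdHx; col[4] = (MatStencil){.i = i + 1, .j = j, .k = k};
  v[5] = -HxHzdHy; col[5] = (MatStencil){.i = i, .j = j + 1, .k = k};
  v[6] = -HxHydHz; col[6] = (MatStencil){.i = i, .j = j, .k = k + 1};
  PetscCall(MatSetValuesStencil(B, 1, &row, 7, col, v, INSERT_VALUES));
}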

Are you willing to share the two codes so we can take a look with experienced 
eyes to try to figure out the difference?

Barry




> On Feb 27, 2023, at 5:48 PM, Paul Grosse-Bley 
>  wrote:
>
> Hi Barry,
>
> the reason why I wanted to change to ghost boundaries is that I was worrying 
> about the effect of PCMG's coarsening on these boundary values.
>
> As mentioned before, I am trying to reproduce results from the hpgmg-cuda 
> benchmark (a modified version of it, e.g. using 2nd order instead of 4th 
> etc.).
> I am trying to solve the Poisson equation -\nabla^2 u = 1 with u = 0 on the 
> boundary with rtol=1e-9. My MG solver implemented in hpgmg solves this in 40 
> V-cycles (I weakened it a lot by only doing smooths at the coarse level 
> instead of CG). When I run the "same" MG solver built in PETSc on this 
> problem, it starts out reducing the residual norm as fast or even faster for 
> the first 20-30 iterations, but for the last order of magnitude in the 
> residual norm it needs more than 300 V-cycles, i.e. it gets very slow. At 
> this point I am pretty much out of ideas about what the cause is, especially 
> since e.g. adding back CG at the coarsest level doesn't seem to change the 
> number of iterations at all. Therefore I am suspecting the discretization to 
> be the problem. HPGMG uses an even number of points per dimension (e.g. 256), 
> while PCMG wants an odd number (e.g. 257). So I also tried adding another 
> layer of boundary values for the discretization to effectively use only 254 
> points per dimension. This caused the solver to get even slightly worse.
>
> So can the explicit boundary values screw with the coarsening, especially 
> when they are not finite? Because with the problem as stated in ex45 with 
> finite (i.e. non-zero) boundary values, the MG solver takes only 18 V-cycles.
>
> Best,
> Paul
>
>
>
> On Monday, February 27, 2023 18:17 CET, Barry Smith  wrote:
>
>>
>> Paul,
>>
>> DM_BOUNDARY_GHOSTED would result in the extra ghost locations in the local 
>> vectors (obtained with DMCreateLocalVector()), but they will not appear in the 
>> global vectors obtained with DMCreateGlobalVector(); perhaps this is the 
>> issue? Since they do not appear in the global vector they will not appear in 
>> the linear system so there will be no diagonal entries for you to set since 
>> those rows/columns do not exist in the linear system. In other words, using 
>> DM_BOUNDARY_GHOSTED is a way to avoid needing to put the Dirichlet values 
>> explicitly into the system being solved; DM_BOUNDARY_GHOSTED is generally 
>> more helpful for nonlinear systems than linear systems.
>>
>> Barry
>>
>> > On Feb 27, 2023, at 12:08 PM, Paul Grosse-Bley 
>> >  wrote:
>> >
>> > Hi,
>> >
>> > I would like to modify src/ksp/ksp/tutorials/ex45.c to implement Dirichlet 
>> > boundary conditions using DM_BOUNDARY_GHOSTED instead of using 
>> > DM_BOUNDARY_NONE and explicitly implementing the boundary by adding 
>> > diagonal-only rows.
>> >
>> > My assumption was that with DM_BOUNDARY_GHOSTED all vectors from that DM 
>> > have the extra memory for the ghost entries and that I can basically use 
>> > DMDAGetGhostCorners instead of DMDAGetCorners to access the array gotten 
>> > via DMDAVecGetArray. But when I access 

Re: [petsc-users] How to use DM_BOUNDARY_GHOSTED for Dirichlet boundary conditions

2023-02-27 Thread Barry Smith


  I have not seen explicitly including, or excluding, the Dirichlet boundary 
values in the system having a significant effect on the convergence so long as 
you SCALE the diagonal rows (of those Dirichlet points) by a value similar to 
the other entries along the diagonal. If they are scaled completely 
differently, that can screw up the convergence. For src/ksp/ksp/ex45.c I see 
that the appropriate scaling is used (note the scaling should come from a 
finite element view of the discretization, even if the discretization is finite 
differences, as is done in ex45.c).

   Are you willing to share the two codes so we can take a look with 
experienced eyes to try to figure out the difference?

  Barry




> On Feb 27, 2023, at 5:48 PM, Paul Grosse-Bley 
>  wrote:
> 
> Hi Barry,
> 
> the reason why I wanted to change to ghost boundaries is that I was worrying 
> about the effect of PCMG's coarsening on these boundary values.
> 
> As mentioned before, I am trying to reproduce results from the hpgmg-cuda 
> benchmark (a modified version of it, e.g. using 2nd order instead of 4th 
> etc.).
> I am trying to solve the Poisson equation -\nabla^2 u = 1 with u = 0 on the 
> boundary with rtol=1e-9. My MG solver implemented in hpgmg solves this in 40 
> V-cycles (I weakened it a lot by only doing smooths at the coarse level 
> instead of CG). When I run the "same" MG solver built in PETSc on this 
> problem, it starts out reducing the residual norm as fast or even faster for 
> the first 20-30 iterations, but for the last order of magnitude in the 
> residual norm it needs more than 300 V-cycles, i.e. it gets very slow. At 
> this point I am pretty much out of ideas about what the cause is, especially 
> since e.g. adding back CG at the coarsest level doesn't seem to change the 
> number of iterations at all. Therefore I am suspecting the discretization to 
> be the problem. HPGMG uses an even number of points per dimension (e.g. 256), 
> while PCMG wants an odd number (e.g. 257). So I also tried adding another 
> layer of boundary values for the discretization to effectively use only 254 
> points per dimension. This caused the solver to get even slightly worse.
> 
> So can the explicit boundary values screw with the coarsening, especially 
> when they are not finite? Because with the problem as stated in ex45 with 
> finite (i.e. non-zero) boundary values, the MG solver takes only 18 V-cycles.
> 
> Best,
> Paul
> 
> 
> 
> On Monday, February 27, 2023 18:17 CET, Barry Smith  wrote:
>  
>> 
>> Paul,
>> 
>> DM_BOUNDARY_GHOSTED would result in the extra ghost locations in the local 
>> vectors (obtained with DMCreateLocalVector()), but they will not appear in the 
>> global vectors obtained with DMCreateGlobalVector(); perhaps this is the 
>> issue? Since they do not appear in the global vector they will not appear in 
>> the linear system so there will be no diagonal entries for you to set since 
>> those rows/columns do not exist in the linear system. In other words, using 
>> DM_BOUNDARY_GHOSTED is a way to avoid needing to put the Dirichlet values 
>> explicitly into the system being solved; DM_BOUNDARY_GHOSTED is generally 
>> more helpful for nonlinear systems than linear systems.
>> 
>> Barry
>> 
>> > On Feb 27, 2023, at 12:08 PM, Paul Grosse-Bley 
>> >  wrote:
>> >
>> > Hi,
>> >
>> > I would like to modify src/ksp/ksp/tutorials/ex45.c to implement Dirichlet 
>> > boundary conditions using DM_BOUNDARY_GHOSTED instead of using 
>> > DM_BOUNDARY_NONE and explicitly implementing the boundary by adding 
>> > diagonal-only rows.
>> >
>> > My assumption was that with DM_BOUNDARY_GHOSTED all vectors from that DM 
>> > have the extra memory for the ghost entries and that I can basically use 
>> > DMDAGetGhostCorners instead of DMDAGetCorners to access the array gotten 
>> > via DMDAVecGetArray. But when I access (gxs, gys, gzs) = (-1,-1,-1) I get 
>> > a segmentation fault. When looking at the implementation of 
>> > DMDAVecGetArray it looked to me as if accessing (-1, -1, -1) should work 
>> > as DMDAVecGetArray passes the ghost corners to VecGetArray3d which then 
>> > adds the right offsets.
>> >
>> > I could not find any example using DM_BOUNDARY_GHOSTED and then actually 
>> > accessing the ghost/boundary elements. Can I assume that they are set to 
>> > zero for the solution vector, i.e. u = 0 on \partial\Omega, and I do not need 
>> > to access them at all?
>> >
>> > Best,
>> > Paul Große-Bley
>>  



Re: [petsc-users] How to use DM_BOUNDARY_GHOSTED for Dirichlet boundary conditions

2023-02-27 Thread Paul Grosse-Bley

Hi Barry,

the reason why I wanted to change to ghost boundaries is that I was worrying 
about the effect of PCMG's coarsening on these boundary values.

As mentioned before, I am trying to reproduce results from the hpgmg-cuda 
benchmark (a modified version of it, e.g. using 2nd order instead of 4th etc.).
I am trying to solve the Poisson equation -\nabla^2 u = 1 with u = 0 on the 
boundary with rtol=1e-9. My MG solver implemented in hpgmg solves this in 40 
V-cycles (I weakened it a lot by only doing smooths at the coarse level instead 
of CG). When I run the "same" MG solver built in PETSc on this problem, it 
starts out reducing the residual norm as fast or even faster for the first 
20-30 iterations, but for the last order of magnitude in the residual norm it 
needs more than 300 V-cycles, i.e. it gets very slow. At this point I am pretty 
much out of ideas about what the cause is, especially since e.g. adding back CG 
at the coarsest level doesn't seem to change the number of iterations at all. 
Therefore I am suspecting the discretization to be the problem. HPGMG uses an 
even number of points per dimension (e.g. 256), while PCMG wants an odd number 
(e.g. 257). So I also tried adding another layer of boundary values for the 
discretization to effectively use only 254 points per dimension. This caused 
the solver to get even slightly worse.

So can the explicit boundary values screw with the coarsening, especially when 
they are not finite? Because with the problem as stated in ex45 with finite 
(i.e. non-zero) boundary values, the MG solver takes only 18 V-cycles.

Best,
Paul



On Monday, February 27, 2023 18:17 CET, Barry Smith  wrote:
 Paul,

DM_BOUNDARY_GHOSTED would result in the extra ghost locations in the local 
vectors (obtained with DMCreateLocalVector()), but they will not appear in the 
global vectors obtained with DMCreateGlobalVector(); perhaps this is the issue? 
Since they do not appear in the global vector they will not appear in the 
linear system so there will be no diagonal entries for you to set since those 
rows/columns do not exist in the linear system. In other words, using 
DM_BOUNDARY_GHOSTED is a way to avoid needing to put the Dirichlet values 
explicitly into the system being solved; DM_BOUNDARY_GHOSTED is generally more 
helpful for nonlinear systems than linear systems.

Barry

> On Feb 27, 2023, at 12:08 PM, Paul Grosse-Bley 
>  wrote:
>
> Hi,
>
> I would like to modify src/ksp/ksp/tutorials/ex45.c to implement Dirichlet 
> boundary conditions using DM_BOUNDARY_GHOSTED instead of using 
> DM_BOUNDARY_NONE and explicitly implementing the boundary by adding 
> diagonal-only rows.
>
> My assumption was that with DM_BOUNDARY_GHOSTED all vectors from that DM have 
> the extra memory for the ghost entries and that I can basically use 
> DMDAGetGhostCorners instead of DMDAGetCorners to access the array gotten via 
> DMDAVecGetArray. But when I access (gxs, gys, gzs) = (-1,-1,-1) I get a 
> segmentation fault. When looking at the implementation of DMDAVecGetArray it 
> looked to me as if accessing (-1, -1, -1) should work as DMDAVecGetArray 
> passes the ghost corners to VecGetArray3d which then adds the right offsets.
>
> I could not find any example using DM_BOUNDARY_GHOSTED and then actually 
> accessing the ghost/boundary elements. Can I assume that they are set to zero 
> for the solution vector, i.e. u = 0 on \partial\Omega, and I do not need to 
> access them at all?
>
> Best,
> Paul Große-Bley
 


Re: [petsc-users] How to use DM_BOUNDARY_GHOSTED for Dirichlet boundary conditions

2023-02-27 Thread Barry Smith
Paul,

DM_BOUNDARY_GHOSTED would result in the extra ghost locations in the local 
vectors (obtained with DMCreateLocalVector()), but they will not appear in the 
global vectors obtained with DMCreateGlobalVector(); perhaps this is the issue? 
Since they do not appear in the global vector they will not appear in the 
linear system so there will be no diagonal entries for you to set since those 
rows/columns do not exist in the linear system.  In other words, using 
DM_BOUNDARY_GHOSTED is a way to avoid needing to put the Dirichlet values 
explicitly into the system being solved; DM_BOUNDARY_GHOSTED is generally more 
helpful for nonlinear systems than linear systems.
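To make the local/global distinction concrete, here is a minimal sketch 
(assumptions: a 3D DMDA named da created with DM_BOUNDARY_GHOSTED and stencil 
width 1, homogeneous Dirichlet data): the extra slots exist only in a local 
vector, so that is where the u = 0 layer can be written; an array obtained from 
a global vector has no index -1 to access.

Vec         lvec;
PetscScalar ***a;
PetscInt    mx, my, mz, gxs, gys, gzs, gxm, gym, gzm;

PetscCall(DMDAGetInfo(da, NULL, &mx, &my, &mz, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL));
PetscCall(DMCreateLocalVector(da, &lvec)); /* has the ghost layer; a vector from DMCreateGlobalVector() does not */
PetscCall(DMDAGetGhostCorners(da, &gxs, &gys, &gzs, &gxm, &gym, &gzm)); /* starts at -1 along physical boundaries */

PetscCall(DMDAVecGetArray(da, lvec, &a)); /* with a local vector, index -1 is legal here */
for (PetscInt k = gzs; k < gzs + gzm; k++)
  for (PetscInt j = gys; j < gys + gym; j++)
    for (PetscInt i = gxs; i < gxs + gxm; i++)
      if (i < 0 || j < 0 || k < 0 || i > mx - 1 || j > my - 1 || k > mz - 1)
        a[k][j][i] = 0.0; /* the DM_BOUNDARY_GHOSTED layer: u = 0 Dirichlet values */
PetscCall(DMDAVecRestoreArray(da, lvec, &a));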

Barry

> On Feb 27, 2023, at 12:08 PM, Paul Grosse-Bley 
>  wrote:
> 
> Hi,
> 
> I would like to modify src/ksp/ksp/tutorials/ex45.c to implement Dirichlet 
> boundary conditions using DM_BOUNDARY_GHOSTED instead of using 
> DM_BOUNDARY_NONE and explicitly implementing the boundary by adding 
> diagonal-only rows.
> 
> My assumption was that with DM_BOUNDARY_GHOSTED all vectors from that DM have 
> the extra memory for the ghost entries and that I can basically use 
> DMDAGetGhostCorners instead of DMDAGetCorners to access the array gotten via 
> DMDAVecGetArray. But when I access (gxs, gys, gzs) = (-1,-1,-1) I get a 
> segmentation fault. When looking at the implementation of DMDAVecGetArray it 
> looked to me as if accessing (-1, -1, -1) should work as DMDAVecGetArray 
> passes the ghost corners to VecGetArray3d which then adds the right offsets.
> 
> I could not find any example using DM_BOUNDARY_GHOSTED and then actually 
> accessing the ghost/boundary elements. Can I assume that they are set to zero 
> for the solution vector, i.e. u = 0 on \partial\Omega, and I do not need to 
> access them at all?
> 
> Best,
> Paul Große-Bley



[petsc-users] How to use DM_BOUNDARY_GHOSTED for Dirichlet boundary conditions

2023-02-27 Thread Paul Grosse-Bley

Hi,

I would like to modify src/ksp/ksp/tutorials/ex45.c to implement Dirichlet 
boundary conditions using DM_BOUNDARY_GHOSTED instead of using DM_BOUNDARY_NONE 
and explicitly implementing the boundary by adding diagonal-only rows.

My assumption was that with DM_BOUNDARY_GHOSTED all vectors from that DM have 
the extra memory for the ghost entries and that I can basically use 
DMDAGetGhostCorners instead of DMDAGetCorners to access the array gotten via 
DMDAVecGetArray. But when I access (gxs, gys, gzs) = (-1,-1,-1) I get a 
segmentation fault. When looking at the implementation of DMDAVecGetArray it 
looked to me as if accessing (-1, -1, -1) should work as DMDAVecGetArray passes 
the ghost corners to VecGetArray3d which then adds the right offsets.

I could not find any example using DM_BOUNDARY_GHOSTED and then actually 
accessing the ghost/boundary elements. Can I assume that they are set to zero 
for the solution vector, i.e. u = 0 on \partial\Omega, and I do not need to access 
them at all?

Best,
Paul Große-Bley


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Pierre Jolivet


> On 27 Feb 2023, at 4:42 PM, Matthew Knepley  wrote:
> 
> On Mon, Feb 27, 2023 at 10:26 AM Pierre Jolivet  > wrote:
>>> On 27 Feb 2023, at 4:16 PM, Matthew Knepley >> > wrote:
>>> 
>>> On Mon, Feb 27, 2023 at 10:13 AM Pierre Jolivet >> > wrote:
> On 27 Feb 2023, at 3:59 PM, Matthew Knepley  > wrote:
> 
> On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang  > wrote:
>> Hi, Matt,
>> 
>> I tested coarsening a mesh by using ParMMg without firedrake, and found 
>> some issues:
>>  see the code and results here:  
>> https://gitlab.com/petsc/petsc/-/issues/1331
>> 
>> Could you have a look and give some comments or suggestions?
> 
> I replied on the issue. More generally, the adaptive refinement software 
> has not seen wide use
 
 :)
 Matt probably meant “the _DMPlex interface_ to adaptive refinement 
 software has not seen wide use”, Mmg has been rather widely used for 10+ 
 years (here is a 13-year old presentation 
 https://www.ljll.math.upmc.fr/hecht/ftp/ff++days/2010/exposes/Morice-MeshMetric.pdf).
>>> 
>>> The interface is certainly new, but even ParMMG is only from Nov 2016, 
>>> which is very new if you are an old person :)
>> 
>> Indeed. In fact, I do believe we should add a DMPlex mechanism to centralize 
>> (redistribute on a single process) a DMPlex and to call Mmg instead of 
>> ParMmg.
>> It would certainly not be scalable for large meshes but:
>> 1) there is no need for ParMmg on small-/medium-scale meshes
>> 2) Mmg is more robust than ParMmg at this point in time
>> 3) Mmg has more features than ParMmg at this point in time, e.g., implicit 
>> remeshing using a level-set
>> 4) there is more industry money funnelled into Mmg than into ParMmg 
>> I think the mechanism I mentioned initially was in the TODO list of the 
>> Firedrake people (or yours?), maybe it’s already done, but in any case it’s 
>> not hooked in the Mmg adaptor code, though it should (erroring out in the 
>> case where the communicator is of size greater than one would then not 
>> happen anymore).
> 
> Yes, we used to do the same thing with partitioners. We can use 
> DMPlexGather().
> 
> I thought MMG only did 2D and ParMMG only did 3D, but this must be wrong now. 
> Can MMG do both?

Mmg does 2D, 3D, and 3D surfaces.
ParMmg only does 3D (with no short-term plan for 2D or 3D surfaces).

Thanks,
Pierre

>   Thanks,
> 
>  Matt
>  
>> Thanks,
>> Pierre
>> 
>>>   Thanks,
>>> 
>>> Matt
>>>  
 Thanks,
 Pierre
 
> , and I expect
> more of these kinds of bugs until more people use it.
> 
>   Thanks,
> 
>  Matt
>  
>> Best wishes,
>> Zongze
>> 
>> 
>> On Mon, 27 Feb 2023 at 20:19, Matthew Knepley > > wrote:
>>> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang >> > wrote:
 Another question on mesh coarsening is about `DMCoarsen` which will 
 fail when running in parallel.
 
 I generate a mesh in Firedrake, and then create function space and 
 functions, after that, I get the dmplex and coarsen it.
 When running in serials, I get the mesh coarsened correctly. But it 
 failed with errors in ParMMG when running parallel.
 
 However, If I did not create function space and functions on the 
 original mesh, everything works fine too.
 
 The code and the error logs are attached.
>>> 
>>> I believe the problem is that Firedrake and PETSc currently have 
>>> incompatible coordinate spaces. We are working
>>> to fix this, and I expect it to work by this summer.
>>> 
>>>   Thanks,
>>> 
>>>  Matt
>>>  
 Thank you for your time and attention。
 
 Best wishes,
 Zongze
 
 
 On Sat, 18 Feb 2023 at 15:24, Zongze Yang >>> > wrote:
> Dear PETSc Group,
> 
> I am writing to inquire about the function DMAdaptLabel in PETSc. 
> I am trying to use it to coarsen a mesh, but the resulting mesh is 
> refined.
>
> In the following code, all of the `adapt` label values were set to 2 
> (DM_ADAPT_COARSEN).
> There must be something wrong. Could you give some suggestions?
>  
> ```python
> from firedrake import *
> from firedrake.petsc import PETSc
> 
> def mark_all_cells(mesh):
>     plex = mesh.topology_dm
>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>         plex.createLabel('adapt')
>         cs, ce = plex.getHeightStratum(0)
>         for i in range(cs, ce):
>             plex.setLabelValue('adapt', i, 2)
> 
>   

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Matthew Knepley
On Mon, Feb 27, 2023 at 10:26 AM Pierre Jolivet  wrote:

> On 27 Feb 2023, at 4:16 PM, Matthew Knepley  wrote:
>
> On Mon, Feb 27, 2023 at 10:13 AM Pierre Jolivet  wrote:
>
>> On 27 Feb 2023, at 3:59 PM, Matthew Knepley  wrote:
>>
>> On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang  wrote:
>>
>>> Hi, Matt,
>>>
>>> I tested coarsening a mesh by using ParMMg without firedrake, and found
>>> some issues:
>>>  see the code and results here:
>>> https://gitlab.com/petsc/petsc/-/issues/1331
>>>
>>> Could you have a look and give some comments or suggestions?
>>>
>>
>> I replied on the issue. More generally, the adaptive refinement software
>> has not seen wide use
>>
>>
>> :)
>> Matt probably meant “the _DMPlex interface_ to adaptive refinement
>> software has not seen wide use”, Mmg has been rather widely used for 10+
>> years (here is a 13-year old presentation
>> https://www.ljll.math.upmc.fr/hecht/ftp/ff++days/2010/exposes/Morice-MeshMetric.pdf
>> ).
>>
>
> The interface is certainly new, but even ParMMG is only from Nov 2016,
> which is very new if you are an old person :)
>
>
> Indeed. In fact, I do believe we should add a DMPlex mechanism to
> centralize (redistribute on a single process) a DMPlex and to call Mmg
> instead of ParMmg.
> It would certainly not be scalable for large meshes but:
> 1) there is no need for ParMmg on small-/medium-scale meshes
> 2) Mmg is more robust than ParMmg at this point in time
> 3) Mmg has more features than ParMmg at this point in time, e.g., implicit
> remeshing using a level-set
> 4) there is more industry money funnelled into Mmg than into ParMmg
> I think the mechanism I mentioned initially was in the TODO list of the
> Firedrake people (or yours?), maybe it’s already done, but in any case it’s
> not hooked in the Mmg adaptor code, though it should (erroring out in the
> case where the communicator is of size greater than one would then not
> happen anymore).
>

Yes, we used to do the same thing with partitioners. We can use
DMPlexGather().

I thought MMG only did 2D and ParMMG only did 3D, but this must be wrong
now. Can MMG do both?

  Thanks,

 Matt


> Thanks,
> Pierre
>
>   Thanks,
>
> Matt
>
>
>> Thanks,
>> Pierre
>>
>> , and I expect
>> more of these kinds of bugs until more people use it.
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> Best wishes,
>>> Zongze
>>>
>>>
>>> On Mon, 27 Feb 2023 at 20:19, Matthew Knepley  wrote:
>>>
 On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang 
 wrote:

> Another question on mesh coarsening is about `DMCoarsen` which will
> fail when running in parallel.
>
> I generate a mesh in Firedrake, and then create function space and
> functions, after that, I get the dmplex and coarsen it.
> When running in serial, I get the mesh coarsened correctly. But it
> failed with errors in ParMMG when running in parallel.
>
> However, if I did not create function space and functions on the
> original mesh, everything works fine too.
>
> The code and the error logs are attached.
>

 I believe the problem is that Firedrake and PETSc currently have
 incompatible coordinate spaces. We are working
 to fix this, and I expect it to work by this summer.

   Thanks,

  Matt


> Thank you for your time and attention。
>
> Best wishes,
> Zongze
>
>
> On Sat, 18 Feb 2023 at 15:24, Zongze Yang 
> wrote:
>
>> Dear PETSc Group,
>>
>> I am writing to inquire about the function DMAdaptLabel in PETSc.
> I am trying to use it to coarsen a mesh, but the resulting mesh is
> refined.
>
> In the following code, all of the `adapt` label values were set to 2
>> (DM_ADAPT_COARSEN).
>> There must be something wrong. Could you give some suggestions?
>>
>> ```python
>> from firedrake import *
>> from firedrake.petsc import PETSc
>>
>> def mark_all_cells(mesh):
>>     plex = mesh.topology_dm
>>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>>         plex.createLabel('adapt')
>>         cs, ce = plex.getHeightStratum(0)
>>         for i in range(cs, ce):
>>             plex.setLabelValue('adapt', i, 2)
>>
>>     return plex
>>
>> mesh = RectangleMesh(10, 10, 1, 1)
>>
>> x = SpatialCoordinate(mesh)
>> V = FunctionSpace(mesh, 'CG', 1)
>> f = Function(V).interpolate(10 + 10*sin(x[0]))
>> triplot(mesh)
>>
>> plex = mark_all_cells(mesh)
>> new_plex = plex.adaptLabel('adapt')
>> mesh = Mesh(new_plex)
>> triplot(mesh)
>> ```
>>
>> Thank you very much for your time.
>>
>> Best wishes,
>> Zongze
>>
>

 --
 What most experimenters take for granted before they begin their
 experiments is infinitely more interesting than any results to which their
 experiments lead.
 -- Norbert Wiener

 https://www.cse.buffalo.edu/~knepley/
 

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Zongze Yang
Yes, it seems that Firedrake only works with DMPlex. Thanks.

Best wishes,
Zongze


On Mon, 27 Feb 2023 at 22:53, Matthew Knepley  wrote:

> On Mon, Feb 27, 2023 at 9:45 AM Zongze Yang  wrote:
>
>> Hi, Matt
>>
>> Thanks for your clarification. Can I change the type of DMPlex to
>> DMForest?
>>
>
> You can, however DMForest is for structured adaptive meshes using
> quadtrees, and I do not believe
> Firedrake works with that.
>
>   Thanks,
>
> Matt
>
>
>> Best wishes,
>> Zongze
>>
>>
>> On Mon, 27 Feb 2023 at 20:18, Matthew Knepley  wrote:
>>
>>> On Sat, Feb 18, 2023 at 2:25 AM Zongze Yang 
>>> wrote:
>>>
 Dear PETSc Group,

 I am writing to inquire about the function DMAdaptLabel in PETSc.
 I am trying to use it coarse a mesh, but the resulting mesh is refined.

 In the following code, all of the `adpat` label values were set to 2
 (DM_ADAPT_COARSEN).
 There must be something wrong. Could you give some suggestions?

>>>
>>> Sorry for the late reply. You are right, I need to put in error messages
>>> for this. Here is what is happening.
>>> PETSc tries to fallback if you do not have certain packages. In this
>>> case, you are not using DMForest,
>>> which responds to both coarsen and refine, so the
>>> mesh generator interprets all markers as refine (they
>>> cannot coarsen). I will add a check that fails on the coarsen marker.
>>>
>>> Coarsening is much more difficult in the presence of boundaries, which
>>> is why it is not implemented in
>>> most packages. For unstructured coarsening, I do not think there is any
>>> choice but MMG.
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>> ```python
 from firedrake import *
 from firedrake.petsc import PETSc

 def mark_all_cells(mesh):
 plex = mesh.topology_dm
 with PETSc.Log.Event("ADD_ADAPT_LABEL"):
 plex.createLabel('adapt')
 cs, ce = plex.getHeightStratum(0)
 for i in range(cs, ce):
 plex.setLabelValue('adapt', i, 2)

 return plex

 mesh = RectangleMesh(10, 10, 1, 1)

 x = SpatialCoordinate(mesh)
 V = FunctionSpace(mesh, 'CG', 1)
 f = Function(V).interpolate(10 + 10*sin(x[0]))
 triplot(mesh)

 plex = mark_all_cells(mesh)
 new_plex = plex.adaptLabel('adapt')
 mesh = Mesh(new_plex)
 triplot(mesh)
 ```

 Thank you very much for your time.

 Best wishes,
 Zongze

>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> 
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Pierre Jolivet


> On 27 Feb 2023, at 4:16 PM, Matthew Knepley  wrote:
> 
> On Mon, Feb 27, 2023 at 10:13 AM Pierre Jolivet  > wrote:
>>> On 27 Feb 2023, at 3:59 PM, Matthew Knepley >> > wrote:
>>> 
>>> On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang >> > wrote:
 Hi, Matt,
 
 I tested coarsening a mesh by using ParMMg without firedrake, and found 
 some issues:
  see the code and results here:  
 https://gitlab.com/petsc/petsc/-/issues/1331
 
 Could you have a look and give some comments or suggestions?
>>> 
>>> I replied on the issue. More generally, the adaptive refinement software 
>>> has not seen wide use
>> 
>> :)
>> Matt probably meant “the _DMPlex interface_ to adaptive refinement software 
>> has not seen wide use”, Mmg has been rather widely used for 10+ years (here 
>> is a 13-year old presentation 
>> https://www.ljll.math.upmc.fr/hecht/ftp/ff++days/2010/exposes/Morice-MeshMetric.pdf).
> 
> The interface is certainly new, but even ParMMG is only from Nov 2016, which 
> is very new if you are an old person :)

Indeed. In fact, I do believe we should add a DMPlex mechanism to centralize 
(redistribute on a single process) a DMPlex and to call Mmg instead of ParMmg.
It would certainly not be scalable for large meshes but:
1) there is no need for ParMmg on small-/medium-scale meshes
2) Mmg is more robust than ParMmg at this point in time
3) Mmg has more features than ParMmg at this point in time, e.g., implicit 
remeshing using a level-set
4) there is more industry money funnelled into Mmg than into ParMmg 
I think the mechanism I mentioned initially was in the TODO list of the 
Firedrake people (or yours?), maybe it’s already done, but in any case it’s not 
hooked in the Mmg adaptor code, though it should (erroring out in the case 
where the communicator is of size greater than one would then not happen 
anymore).

Thanks,
Pierre

>   Thanks,
> 
> Matt
>  
>> Thanks,
>> Pierre
>> 
>>> , and I expect
>>> more of these kinds of bugs until more people use it.
>>> 
>>>   Thanks,
>>> 
>>>  Matt
>>>  
 Best wishes,
 Zongze
 
 
 On Mon, 27 Feb 2023 at 20:19, Matthew Knepley >>> > wrote:
> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang  > wrote:
>> Another question on mesh coarsening is about `DMCoarsen` which will fail 
>> when running in parallel.
>> 
>> I generate a mesh in Firedrake, and then create function space and 
>> functions, after that, I get the dmplex and coarsen it.
>> When running in serial, I get the mesh coarsened correctly. But it 
>> failed with errors in ParMMG when running in parallel.
>> 
>> However, if I did not create function space and functions on the 
>> original mesh, everything works fine too.
>> 
>> The code and the error logs are attached.
> 
> I believe the problem is that Firedrake and PETSc currently have 
> incompatible coordinate spaces. We are working
> to fix this, and I expect it to work by this summer.
> 
>   Thanks,
> 
>  Matt
>  
>> Thank you for your time and attention.
>> 
>> Best wishes,
>> Zongze
>> 
>> 
>> On Sat, 18 Feb 2023 at 15:24, Zongze Yang > > wrote:
>>> Dear PETSc Group,
>>> 
>>> I am writing to inquire about the function DMAdaptLabel in PETSc. 
>>> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>>> 
>>> In the following code, all of the `adapt` label values were set to 2 
>>> (DM_ADAPT_COARSEN).
>>> There must be something wrong. Could you give some suggestions?
>>>  
>>> ```python
>>> from firedrake import *
>>> from firedrake.petsc import PETSc
>>> 
>>> def mark_all_cells(mesh):
>>>     plex = mesh.topology_dm
>>>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>>>         plex.createLabel('adapt')
>>>         cs, ce = plex.getHeightStratum(0)
>>>         for i in range(cs, ce):
>>>             plex.setLabelValue('adapt', i, 2)
>>> 
>>>     return plex
>>> 
>>> mesh = RectangleMesh(10, 10, 1, 1)
>>> 
>>> x = SpatialCoordinate(mesh)
>>> V = FunctionSpace(mesh, 'CG', 1)
>>> f = Function(V).interpolate(10 + 10*sin(x[0]))
>>> triplot(mesh)
>>> 
>>> plex = mark_all_cells(mesh)
>>> new_plex = plex.adaptLabel('adapt')
>>> mesh = Mesh(new_plex)
>>> triplot(mesh)
>>> ```
>>> 
>>> Thank you very much for your time.
>>> 
>>> Best wishes,
>>> Zongze
> 
> 
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/ 
> 

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Matthew Knepley
On Mon, Feb 27, 2023 at 10:13 AM Pierre Jolivet  wrote:

> On 27 Feb 2023, at 3:59 PM, Matthew Knepley  wrote:
>
> On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang  wrote:
>
>> Hi, Matt,
>>
>> I tested coarsening a mesh by using ParMMg without firedrake, and found
>> some issues:
>>  see the code and results here:
>> https://gitlab.com/petsc/petsc/-/issues/1331
>>
>> Could you have a look and give some comments or suggestions?
>>
>
> I replied on the issue. More generally, the adaptive refinement software
> has not seen wide use
>
>
> :)
> Matt probably meant “the _DMPlex interface_ to adaptive refinement
> software has not seen wide use”, Mmg has been rather widely used for 10+
> years (here is a 13-year old presentation
> https://www.ljll.math.upmc.fr/hecht/ftp/ff++days/2010/exposes/Morice-MeshMetric.pdf
> ).
>

The interface is certainly new, but even ParMMG is only from Nov 2016,
which is very new if you are an old person :)

  Thanks,

Matt


> Thanks,
> Pierre
>
> , and I expect
> more of these kinds of bugs until more people use it.
>
>   Thanks,
>
>  Matt
>
>
>> Best wishes,
>> Zongze
>>
>>
>> On Mon, 27 Feb 2023 at 20:19, Matthew Knepley  wrote:
>>
>>> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang 
>>> wrote:
>>>
 Another question on mesh coarsening is about `DMCoarsen` which will
 fail when running in parallel.

 I generate a mesh in Firedrake, and then create function space and
 functions, after that, I get the dmplex and coarsen it.
 When running in serials, I get the mesh coarsened correctly. But it
 failed with errors in ParMMG when running parallel.

 However, If I did not create function space and functions on the
 original mesh, everything works fine too.

 The code and the error logs are attached.

>>>
>>> I believe the problem is that Firedrake and PETSc currently have
>>> incompatible coordinate spaces. We are working
>>> to fix this, and I expect it to work by this summer.
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>>
 Thank you for your time and attention。

 Best wishes,
 Zongze


 On Sat, 18 Feb 2023 at 15:24, Zongze Yang  wrote:

> Dear PETSc Group,
>
> I am writing to inquire about the function DMAdaptLabel in PETSc.
> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>
> In the following code, all of the `adapt` label values were set to 2
> (DM_ADAPT_COARSEN).
> There must be something wrong. Could you give some suggestions?
>
> ```python
> from firedrake import *
> from firedrake.petsc import PETSc
>
> def mark_all_cells(mesh):
>     plex = mesh.topology_dm
>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>         plex.createLabel('adapt')
>         cs, ce = plex.getHeightStratum(0)
>         for i in range(cs, ce):
>             plex.setLabelValue('adapt', i, 2)
>
>     return plex
>
> mesh = RectangleMesh(10, 10, 1, 1)
>
> x = SpatialCoordinate(mesh)
> V = FunctionSpace(mesh, 'CG', 1)
> f = Function(V).interpolate(10 + 10*sin(x[0]))
> triplot(mesh)
>
> plex = mark_all_cells(mesh)
> new_plex = plex.adaptLabel('adapt')
> mesh = Mesh(new_plex)
> triplot(mesh)
> ```
>
> Thank you very much for your time.
>
> Best wishes,
> Zongze
>

>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> 
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Pierre Jolivet


> On 27 Feb 2023, at 3:59 PM, Matthew Knepley  wrote:
> 
> On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang  > wrote:
>> Hi, Matt,
>> 
>> I tested coarsening a mesh by using ParMMg without firedrake, and found some 
>> issues:
>>  see the code and results here:  https://gitlab.com/petsc/petsc/-/issues/1331
>> 
>> Could you have a look and give some comments or suggestions?
> 
> I replied on the issue. More generally, the adaptive refinement software has 
> not seen wide use

:)
Matt probably meant “the _DMPlex interface_ to adaptive refinement software has 
not seen wide use”, Mmg has been rather widely used for 10+ years (here is a 
13-year old presentation 
https://www.ljll.math.upmc.fr/hecht/ftp/ff++days/2010/exposes/Morice-MeshMetric.pdf).

Thanks,
Pierre

> , and I expect
> more of these kinds of bugs until more people use it.
> 
>   Thanks,
> 
>  Matt
>  
>> Best wishes,
>> Zongze
>> 
>> 
>> On Mon, 27 Feb 2023 at 20:19, Matthew Knepley > > wrote:
>>> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang >> > wrote:
 Another question on mesh coarsening is about `DMCoarsen` which will fail 
 when running in parallel.
 
 I generate a mesh in Firedrake, and then create function space and 
 functions, after that, I get the dmplex and coarsen it.
 When running in serials, I get the mesh coarsened correctly. But it failed 
 with errors in ParMMG when running parallel.
 
 However, If I did not create function space and functions on the original 
 mesh, everything works fine too.
 
 The code and the error logs are attached.
>>> 
>>> I believe the problem is that Firedrake and PETSc currently have 
>>> incompatible coordinate spaces. We are working
>>> to fix this, and I expect it to work by this summer.
>>> 
>>>   Thanks,
>>> 
>>>  Matt
>>>  
 Thank you for your time and attention。
 
 Best wishes,
 Zongze
 
 
 On Sat, 18 Feb 2023 at 15:24, Zongze Yang >>> > wrote:
> Dear PETSc Group,
> 
> I am writing to inquire about the function DMAdaptLabel in PETSc. 
> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
> 
> In the following code, all of the `adapt` label values were set to 2 
> (DM_ADAPT_COARSEN).
> There must be something wrong. Could you give some suggestions?
>  
> ```python
> from firedrake import *
> from firedrake.petsc import PETSc
> 
> def mark_all_cells(mesh):
>     plex = mesh.topology_dm
>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>         plex.createLabel('adapt')
>         cs, ce = plex.getHeightStratum(0)
>         for i in range(cs, ce):
>             plex.setLabelValue('adapt', i, 2)
> 
>     return plex
> 
> mesh = RectangleMesh(10, 10, 1, 1)
> 
> x = SpatialCoordinate(mesh)
> V = FunctionSpace(mesh, 'CG', 1)
> f = Function(V).interpolate(10 + 10*sin(x[0]))
> triplot(mesh)
> 
> plex = mark_all_cells(mesh)
> new_plex = plex.adaptLabel('adapt')
> mesh = Mesh(new_plex)
> triplot(mesh)
> ```
> 
> Thank you very much for your time.
> 
> Best wishes,
> Zongze
>>> 
>>> 
>>> -- 
>>> What most experimenters take for granted before they begin their 
>>> experiments is infinitely more interesting than any results to which their 
>>> experiments lead.
>>> -- Norbert Wiener
>>> 
>>> https://www.cse.buffalo.edu/~knepley/ 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Matthew Knepley
On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang  wrote:

> Hi, Matt,
>
> I tested coarsening a mesh by using ParMMg without firedrake, and found
> some issues:
>  see the code and results here:
> https://gitlab.com/petsc/petsc/-/issues/1331
>
> Could you have a look and give some comments or suggestions?
>

I replied on the issue. More generally, the adaptive refinement software
has not seen wide use, and I expect
more of these kinds of bugs until more people use it.

  Thanks,

 Matt


> Best wishes,
> Zongze
>
>
> On Mon, 27 Feb 2023 at 20:19, Matthew Knepley  wrote:
>
>> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang  wrote:
>>
>>> Another question on mesh coarsening is about `DMCoarsen` which will fail
>>> when running in parallel.
>>>
>>> I generate a mesh in Firedrake, and then create function space and
>>> functions, after that, I get the dmplex and coarsen it.
>>> When running in serial, I get the mesh coarsened correctly. But it
>>> failed with errors in ParMMG when running in parallel.
>>>
>>> However, if I did not create function space and functions on the
>>> original mesh, everything works fine too.
>>>
>>> The code and the error logs are attached.
>>>
>>
>> I believe the problem is that Firedrake and PETSc currently have
>> incompatible coordinate spaces. We are working
>> to fix this, and I expect it to work by this summer.
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> Thank you for your time and attention.
>>>
>>> Best wishes,
>>> Zongze
>>>
>>>
>>> On Sat, 18 Feb 2023 at 15:24, Zongze Yang  wrote:
>>>
 Dear PETSc Group,

 I am writing to inquire about the function DMAdaptLabel in PETSc.
 I am trying to use it coarse a mesh, but the resulting mesh is refined.

 In the following code, all of the `adpat` label values were set to 2
 (DM_ADAPT_COARSEN).
 There must be something wrong. Could you give some suggestions?

 ```python
 from firedrake import *
 from firedrake.petsc import PETSc

 def mark_all_cells(mesh):
 plex = mesh.topology_dm
 with PETSc.Log.Event("ADD_ADAPT_LABEL"):
 plex.createLabel('adapt')
 cs, ce = plex.getHeightStratum(0)
 for i in range(cs, ce):
 plex.setLabelValue('adapt', i, 2)

 return plex

 mesh = RectangleMesh(10, 10, 1, 1)

 x = SpatialCoordinate(mesh)
 V = FunctionSpace(mesh, 'CG', 1)
 f = Function(V).interpolate(10 + 10*sin(x[0]))
 triplot(mesh)

 plex = mark_all_cells(mesh)
 new_plex = plex.adaptLabel('adapt')
 mesh = Mesh(new_plex)
 triplot(mesh)
 ```

 Thank you very much for your time.

 Best wishes,
 Zongze

>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Matthew Knepley
On Mon, Feb 27, 2023 at 9:45 AM Zongze Yang  wrote:

> Hi, Matt
>
> Thanks for your clarification. Can I change the type of DMPlex to DMForest?
>

You can, however DMForest is for structured adaptive meshes using
quadtrees, and I do not believe
Firedrake works with that.
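For completeness, a minimal C sketch of putting a DMForest on top of an existing
DMPlex is below (assumptions: PETSc configured with p4est, a 2D base mesh so the
type is DMP4EST; whether Firedrake can then consume the result is exactly the
caveat above):

DM forest;

PetscCall(DMCreate(PETSC_COMM_WORLD, &forest));
PetscCall(DMSetType(forest, DMP4EST));      /* use DMP8EST for a 3D base mesh */
PetscCall(DMForestSetBaseDM(forest, plex)); /* plex is the existing DMPlex    */
PetscCall(DMSetUp(forest));
/* the forest DM responds to both refine and coarsen adaptation flags */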

  Thanks,

Matt


> Best wishes,
> Zongze
>
>
> On Mon, 27 Feb 2023 at 20:18, Matthew Knepley  wrote:
>
>> On Sat, Feb 18, 2023 at 2:25 AM Zongze Yang  wrote:
>>
>>> Dear PETSc Group,
>>>
>>> I am writing to inquire about the function DMAdaptLabel in PETSc.
>>> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>>>
>>> In the following code, all of the `adapt` label values were set to 2
>>> (DM_ADAPT_COARSEN).
>>> There must be something wrong. Could you give some suggestions?
>>>
>>
>> Sorry for the late reply. You are right, I need to put in error messages
>> for this. Here is what is happening.
>> PETSc tries to fallback if you do not have certain packages. In this
>> case, you are not using DMForest,
>> which responds to both coarsen and refine, so the
>> mesh generator interprets all markers as refine (they
>> cannot coarsen). I will add a check that fails on the coarsen marker.
>>
>> Coarsening is much more difficult in the presence of boundaries, which is
>> why it is not implemented in
>> most packages. For unstructured coarsening, I do not think there is any
>> choice but MMG.
>>
>>   Thanks,
>>
>>  Matt
>>
>> ```python
>>> from firedrake import *
>>> from firedrake.petsc import PETSc
>>>
>>> def mark_all_cells(mesh):
>>>     plex = mesh.topology_dm
>>>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>>>         plex.createLabel('adapt')
>>>         cs, ce = plex.getHeightStratum(0)
>>>         for i in range(cs, ce):
>>>             plex.setLabelValue('adapt', i, 2)
>>>
>>>     return plex
>>>
>>> mesh = RectangleMesh(10, 10, 1, 1)
>>>
>>> x = SpatialCoordinate(mesh)
>>> V = FunctionSpace(mesh, 'CG', 1)
>>> f = Function(V).interpolate(10 + 10*sin(x[0]))
>>> triplot(mesh)
>>>
>>> plex = mark_all_cells(mesh)
>>> new_plex = plex.adaptLabel('adapt')
>>> mesh = Mesh(new_plex)
>>> triplot(mesh)
>>> ```
>>>
>>> Thank you very much for your time.
>>>
>>> Best wishes,
>>> Zongze
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Zongze Yang
Hi, Matt,

I tested coarsening a mesh by using ParMMg without firedrake, and found
some issues:
 see the code and results here:
https://gitlab.com/petsc/petsc/-/issues/1331

Could you have a look and give some comments or suggestions?

Best wishes,
Zongze


On Mon, 27 Feb 2023 at 20:19, Matthew Knepley  wrote:

> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang  wrote:
>
>> Another question on mesh coarsening is about `DMCoarsen` which will fail
>> when running in parallel.
>>
>> I generate a mesh in Firedrake, and then create function space and
>> functions, after that, I get the dmplex and coarsen it.
>> When running in serial, I get the mesh coarsened correctly. But it
>> failed with errors in ParMMG when running in parallel.
>>
>> However, if I did not create function space and functions on the original
>> mesh, everything works fine too.
>>
>> The code and the error logs are attached.
>>
>
> I believe the problem is that Firedrake and PETSc currently have
> incompatible coordinate spaces. We are working
> to fix this, and I expect it to work by this summer.
>
>   Thanks,
>
>  Matt
>
>
>> Thank you for your time and attention.
>>
>> Best wishes,
>> Zongze
>>
>>
>> On Sat, 18 Feb 2023 at 15:24, Zongze Yang  wrote:
>>
>>> Dear PETSc Group,
>>>
>>> I am writing to inquire about the function DMAdaptLabel in PETSc.
>>> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>>>
>>> In the following code, all of the `adapt` label values were set to 2
>>> (DM_ADAPT_COARSEN).
>>> There must be something wrong. Could you give some suggestions?
>>>
>>> ```python
>>> from firedrake import *
>>> from firedrake.petsc import PETSc
>>>
>>> def mark_all_cells(mesh):
>>>     plex = mesh.topology_dm
>>>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>>>         plex.createLabel('adapt')
>>>         cs, ce = plex.getHeightStratum(0)
>>>         for i in range(cs, ce):
>>>             plex.setLabelValue('adapt', i, 2)
>>>
>>>     return plex
>>>
>>> mesh = RectangleMesh(10, 10, 1, 1)
>>>
>>> x = SpatialCoordinate(mesh)
>>> V = FunctionSpace(mesh, 'CG', 1)
>>> f = Function(V).interpolate(10 + 10*sin(x[0]))
>>> triplot(mesh)
>>>
>>> plex = mark_all_cells(mesh)
>>> new_plex = plex.adaptLabel('adapt')
>>> mesh = Mesh(new_plex)
>>> triplot(mesh)
>>> ```
>>>
>>> Thank you very much for your time.
>>>
>>> Best wishes,
>>> Zongze
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Zongze Yang
Hi, Matt

Thanks for your clarification. Can I change the type of DMPlex to DMForest?

Best wishes,
Zongze


On Mon, 27 Feb 2023 at 20:18, Matthew Knepley  wrote:

> On Sat, Feb 18, 2023 at 2:25 AM Zongze Yang  wrote:
>
>> Dear PETSc Group,
>>
>> I am writing to inquire about the function DMAdaptLabel in PETSc.
>> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>>
>> In the following code, all of the `adapt` label values were set to 2
>> (DM_ADAPT_COARSEN).
>> There must be something wrong. Could you give some suggestions?
>>
>
> Sorry for the late reply. You are right, I need to put in error messages
> for this. Here is what is happening.
> PETSc tries to fallback if you do not have certain packages. In this case,
> you are not using DMForest,
> which responds to both coarsen and refine, so the
> mesh generator interprets all markers as refine (they
> cannot coarsen). I will add a check that fails on the coarsen marker.
>
> Coarsening is much more difficult in the presence of boundaries, which is
> why it is not implemented in
> most packages. For unstructured coarsening, I do not think there is any
> choice but MMG.
>
>   Thanks,
>
>  Matt
>
> ```python
>> from firedrake import *
>> from firedrake.petsc import PETSc
>>
>> def mark_all_cells(mesh):
>>     plex = mesh.topology_dm
>>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>>         plex.createLabel('adapt')
>>         cs, ce = plex.getHeightStratum(0)
>>         for i in range(cs, ce):
>>             plex.setLabelValue('adapt', i, 2)
>>
>>     return plex
>>
>> mesh = RectangleMesh(10, 10, 1, 1)
>>
>> x = SpatialCoordinate(mesh)
>> V = FunctionSpace(mesh, 'CG', 1)
>> f = Function(V).interpolate(10 + 10*sin(x[0]))
>> triplot(mesh)
>>
>> plex = mark_all_cells(mesh)
>> new_plex = plex.adaptLabel('adapt')
>> mesh = Mesh(new_plex)
>> triplot(mesh)
>> ```
>>
>> Thank you very much for your time.
>>
>> Best wishes,
>> Zongze
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] petsc compiled without MPI

2023-02-27 Thread Matthew Knepley
On Mon, Feb 27, 2023 at 8:41 AM Long, Jianbo  wrote:

> Thanks for the explanations! It turns out the issue with running the
> sequentially compiled petsc is the PetscFinalize() function. Since my
> subroutine involving petsc functions needs to be called multiple times in
> the program, I have to comment out PetscFinalize() at the end of the
> subroutine; otherwise, at the next call of this subroutine, petsc stops
> and throws an error about MPI_Comm_set_errhandler!
>

Yes, you are supposed to call PetscInitialize() _once_ at the beginning of
the program, and PetscFinalize() _once_ at the end of the program.
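A minimal sketch of the intended structure (in C for brevity; the routine name is
illustrative, not from your code):

#include <petscsys.h>

static PetscErrorCode do_one_solve(void)
{
  PetscFunctionBeginUser;
  /* create, use, and destroy PETSc objects here; no PetscInitialize/PetscFinalize */
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL)); /* once, at program start */
  for (int it = 0; it < 3; it++) PetscCall(do_one_solve());
  PetscCall(PetscFinalize());                           /* once, at program end */
  return 0;
}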

  Thanks,

Matt


> Jianbo
>
> On Sun, Feb 26, 2023 at 4:39 PM Satish Balay  wrote:
>
>> On Sun, 26 Feb 2023, Pierre Jolivet wrote:
>>
>> >
>> >
>> > > On 25 Feb 2023, at 11:44 PM, Long, Jianbo  wrote:
>> > >
>> > > Hello,
>> > >
>> > > For some of my applications, I need to use petsc without mpi, or use
>> it sequentially. I wonder where I can find examples/tutorials for this ?
>> >
>> > You can run sequentially with just a single MPI process (-n 1).
>>
>> even if you build with mpich/openmpi - you can run sequentially without
>> mpiexec - i.e:
>>
>> ./binary
>>
>> One reason to do this [instead of building PETSc with --with-mpi=0] - is
>> if you are mixing in multiple pkgs that have MPI dependencies [in which
>> case - its best to build all these pkgs with the same mpich or openmpi -
>> but still run sequentially].
>>
>> Satish
>>
>> > If you need to run without MPI whatsoever, you’ll need to have a
>> separate PETSc installation which was configured --with-mpi=0
>> > In both cases, the same user-code will run, i.e., all PETSc examples
>> available with the sources will work (though some are designed purely for
>> parallel experiments and may error out early on purpose).
>> >
>> > Thanks,
>> > Pierre
>> >
>> > > Thanks very much,
>> > > Jianbo Long
>> >
>> >
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] petsc compiled without MPI

2023-02-27 Thread Long, Jianbo
Thanks for the explanations! It turns out the issue with running the
sequentially compiled petsc is the PetscFinalize() function. Since my
subroutine involving petsc functions needs to be called multiple times in
the program, I have to comment out PetscFinalize() at the end of the
subroutine; otherwise, at the next call of this subroutine, petsc stops
and throws an error about MPI_Comm_set_errhandler!

Jianbo

On Sun, Feb 26, 2023 at 4:39 PM Satish Balay  wrote:

> On Sun, 26 Feb 2023, Pierre Jolivet wrote:
>
> >
> >
> > > On 25 Feb 2023, at 11:44 PM, Long, Jianbo  wrote:
> > >
> > > Hello,
> > >
> > > For some of my applications, I need to use petsc without mpi, or use
> it sequentially. I wonder where I can find examples/tutorials for this ?
> >
> > You can run sequentially with just a single MPI process (-n 1).
>
> even if you build with mpich/openmpi - you can run sequentially without
> mpiexec - i.e:
>
> ./binary
>
> One reason to do this [instead of building PETSc with --with-mpi=0] - is
> if you are mixing in multiple pkgs that have MPI dependencies [in which
> case - its best to build all these pkgs with the same mpich or openmpi -
> but still run sequentially].
>
> Satish
>
> > If you need to run without MPI whatsoever, you’ll need to have a
> separate PETSc installation which was configured --with-mpi=0
> > In both cases, the same user-code will run, i.e., all PETSc examples
> available with the sources will work (though some are designed purely for
> parallel experiments and may error out early on purpose).
> >
> > Thanks,
> > Pierre
> >
> > > Thanks very much,
> > > Jianbo Long
> >
> >
>


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Matthew Knepley
On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang  wrote:

> Another question on mesh coarsening is about `DMCoarsen` which will fail
> when running in parallel.
>
> I generate a mesh in Firedrake, and then create function space and
> functions, after that, I get the dmplex and coarsen it.
> When running in serial, I get the mesh coarsened correctly. But it failed
> with errors in ParMMG when running in parallel.
>
> However, if I did not create function space and functions on the original
> mesh, everything works fine too.
>
> The code and the error logs are attached.
>

I believe the problem is that Firedrake and PETSc currently have
incompatible coordinate spaces. We are working
to fix this, and I expect it to work by this summer.

  Thanks,

 Matt


> Thank you for your time and attention.
>
> Best wishes,
> Zongze
>
>
> On Sat, 18 Feb 2023 at 15:24, Zongze Yang  wrote:
>
>> Dear PETSc Group,
>>
>> I am writing to inquire about the function DMAdaptLabel in PETSc.
>> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>>
>> In the following code, all of the `adapt` label values were set to 2
>> (DM_ADAPT_COARSEN).
>> There must be something wrong. Could you give some suggestions?
>>
>> ```python
>> from firedrake import *
>> from firedrake.petsc import PETSc
>>
>> def mark_all_cells(mesh):
>>     plex = mesh.topology_dm
>>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>>         plex.createLabel('adapt')
>>         cs, ce = plex.getHeightStratum(0)
>>         for i in range(cs, ce):
>>             plex.setLabelValue('adapt', i, 2)
>>
>>     return plex
>>
>> mesh = RectangleMesh(10, 10, 1, 1)
>>
>> x = SpatialCoordinate(mesh)
>> V = FunctionSpace(mesh, 'CG', 1)
>> f = Function(V).interpolate(10 + 10*sin(x[0]))
>> triplot(mesh)
>>
>> plex = mark_all_cells(mesh)
>> new_plex = plex.adaptLabel('adapt')
>> mesh = Mesh(new_plex)
>> triplot(mesh)
>> ```
>>
>> Thank you very much for your time.
>>
>> Best wishes,
>> Zongze
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Matthew Knepley
On Sat, Feb 18, 2023 at 2:25 AM Zongze Yang  wrote:

> Dear PETSc Group,
>
> I am writing to inquire about the function DMAdaptLabel in PETSc.
> I am trying to use it to coarsen a mesh, but the resulting mesh is refined.
>
> In the following code, all of the `adapt` label values were set to 2
> (DM_ADAPT_COARSEN).
> There must be something wrong. Could you give some suggestions?
>

Sorry for the late reply. You are right, I need to put in error messages
for this. Here is what is happening.
PETSc tries to fall back if you do not have certain packages. In this case,
you are not using DMForest,
which responds to both coarsen and refine, so the
mesh generator interprets all markers as refine (they
cannot coarsen). I will add a check that fails on the coarsen marker.

Coarsening is much more difficult in the presence of boundaries, which is
why it is not implemented in
most packages. For unstructured coarsening, I do not think there is any
choice but MMG.

  Thanks,

 Matt

```python
> from firedrake import *
> from firedrake.petsc import PETSc
>
> def mark_all_cells(mesh):
>     plex = mesh.topology_dm
>     with PETSc.Log.Event("ADD_ADAPT_LABEL"):
>         plex.createLabel('adapt')
>         cs, ce = plex.getHeightStratum(0)
>         for i in range(cs, ce):
>             plex.setLabelValue('adapt', i, 2)
>
>     return plex
>
> mesh = RectangleMesh(10, 10, 1, 1)
>
> x = SpatialCoordinate(mesh)
> V = FunctionSpace(mesh, 'CG', 1)
> f = Function(V).interpolate(10 + 10*sin(x[0]))
> triplot(mesh)
>
> plex = mark_all_cells(mesh)
> new_plex = plex.adaptLabel('adapt')
> mesh = Mesh(new_plex)
> triplot(mesh)
> ```
>
> Thank you very much for your time.
>
> Best wishes,
> Zongze
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/