Re: [petsc-users] MatConvert changes distribution of local rows

2023-01-12 Thread Pierre Jolivet

> On 13 Jan 2023, at 8:49 AM, Marius Buerkle  wrote:
> 
> Hi,
>  
> I have a matrix A for which I defined the number of local rows per process 
> manually using MatSetSizes. When I use MatConvert to change the matrix type 
> it changes the number of local rows (to what one would get if MatSetSizes is 
> called with PETSC_DECIDE for the number of local rows), which causes problems 
> when doing MatVec products and the like. Is there any way to preserve 
> the number of local rows when using MatConvert?

This is most likely a bug: the local row distribution is not handled properly 
in some MatConvert() implementations.
Could you please share either the matrix types or a minimal working example?

Thanks,
Pierre

>  
> Best,
> Marius
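
For illustration (not part of the original exchange), a minimal sketch of the
kind of reproducer Pierre asks for, assuming an AIJ-to-dense conversion and
rank-dependent local row counts; only standard PETSc calls are used:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A, B;
  PetscMPIInt rank;
  PetscInt    nlocal, mlocal;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  nlocal = 5 + rank; /* deliberately non-uniform local row count */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE));
  PetscCall(MatSetType(A, MATAIJ));
  PetscCall(MatSetUp(A));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatConvert(A, MATDENSE, MAT_INITIAL_MATRIX, &B));
  PetscCall(MatGetLocalSize(B, &mlocal, NULL));
  /* If the distribution is preserved, mlocal == nlocal on every rank */
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] set %d local rows, converted matrix has %d\n", rank, (int)nlocal, (int)mlocal));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));
  PetscCall(MatDestroy(&A));
  PetscCall(MatDestroy(&B));
  PetscCall(PetscFinalize());
  return 0;
}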



[petsc-users] MatConvert changes distribution of local rows

2023-01-12 Thread Marius Buerkle
Hi,

 

I have a matrix A for which I defined the number of local rows per process manually using MatSetSizes. When I use MatConvert to change the matrix type it changes the number of local rows (to what one would get if MatSetSizes is called with PETSC_DECIDE for the number of local rows), which causes problems when doing MatVec products and the like. Is there any way to preserve the number of local rows when using MatConvert?

 

Best,

Marius


Re: [petsc-users] PetscSF Fortran interface

2023-01-12 Thread Junchao Zhang
How about this?

PetscInt, pointer :: remoteOffsets(:)
call PetscSFCreateRemoteOffsetsf90(distributionSF, section_filt_l,
leafSection, remoteoffsets, ierr )
call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
remoteoffsets, leafSection, distributionSF_dof, ierr)
call PetscIntArray1dDestroyF90(remoteOffsets,ierr) // free remoteoffsets
when not needed
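
For reference (not part of the original exchange), a sketch of the corresponding
C-side sequence; the SF distributionSF and the root section section_filt_l are
assumed to exist already:

PetscSection leafSection;
PetscInt    *remoteOffsets; /* allocated by PetscSFDistributeSection() */
PetscSF      distributionSF_dof;

PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &leafSection));
PetscCall(PetscSFDistributeSection(distributionSF, section_filt_l, &remoteOffsets, leafSection));
PetscCall(PetscSFCreateSectionSF(distributionSF, section_filt_l, remoteOffsets, leafSection, &distributionSF_dof));
PetscCall(PetscFree(remoteOffsets)); /* free once the section SF has been created */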

On Thu, Jan 12, 2023 at 11:11 PM Nicholas Arnold-Medabalimi <
narno...@umich.edu> wrote:

> That is what I tried initially; however, I get a segmentation fault. I can
> confirm it's due to the remote offsets because if I try and output
> remoteoffsets between the Distribute Section and Create Section it throws
> the same segmentation fault.
>
> Thanks for the help
> Nicholas
>
> On Thu, Jan 12, 2023 at 11:56 PM Junchao Zhang 
> wrote:
>
>>
>>
>> On Thu, Jan 12, 2023 at 1:28 PM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Hi Junchao
>>>
>>> Going back to this merge request. I'm not sure I follow exactly the
>>> usage described in the commit history. I have a c prototype of what I am
>>> trying to do which is
>>>
>>> PetscSectionCreate(PETSC_COMM_WORLD, &leafSection);
>>> PetscSFDistributeSection(redistributionSF, filteredSection_local, &remoteOffsets,
>>> leafSection);
>>> PetscSFCreateSectionSF(redistributionSF, filteredSection_local,
>>> remoteOffsets, leafSection, &distributionSF_dof);
>>>
>>> But something seems unclear with the usage in fortran around the
>>> remoteoffsets. Do I have to insert the CreateRemoteOffsetsF90 like so? Any
>>> clarification would be greatly appreciated.
>>>
>>> call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
>>> remoteoffsets, leafSection, ierr)
>>> call PetscSFCreateRemoteOffsetsf90(distributionSF, section_filt_l,
>>> leafSection, remoteoffsets, ierr )
>>> call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
>>> remoteoffsets, leafSection, distributionSF_dof, ierr)
>>>
>>> Hi, Nicholas,
>>   Reading through comments at
>> https://gitlab.com/petsc/petsc/-/merge_requests/5386#note_1022942470, I
>> feel it should look like
>>
>> PetscInt, pointer :: remoteOffsets(:)
>> call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
>> remoteoffsets, leafSection, ierr)  // allocate remoteoffsets
>> call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
>> remoteoffsets, leafSection, distributionSF_dof, ierr)
>> call PetscIntArray1dDestroyF90(remoteOffsets,ierr) // free remoteoffsets
>> when not needed
>>
>> Could you try it?
>>
>>
>> Sincerely
>>> Nicholas
>>>
>>> On Tue, Jan 10, 2023 at 4:42 PM Junchao Zhang 
>>> wrote:
>>>
 Hi, Nicholas,
It seems we have implemented it, but with another name,
 PetscSFCreateSectionSFF90, see
 https://gitlab.com/petsc/petsc/-/merge_requests/5386
Try it to see if it works!

 --Junchao Zhang


 On Tue, Jan 10, 2023 at 11:45 AM Nicholas Arnold-Medabalimi <
 narno...@umich.edu> wrote:

> Er to be honest I still can't get my stub to compile properly, and I
> don't know how to go about making a merge request. But here is what I am
> attempting right now. Let me know how best to proceed
>
>
> It's not exactly clear to me how to set up the remote offset
> properly.
>
> in src/vec/is/sf/interface/ftn-custom/zsf.c
>
> PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection
> *rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
> PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO(remoteoffsetsd))
> {
>
>   int * remoteOffsets;
>   *ierr = F90Array1dAccess(aremoteOffsets, PETSC_INT, (void**)
>  &remoteOffsets PETSC_F90_2PTR_PARAM(remoteoffsetsd));if (*ierr) return;
>   *ierr = PetscSFCreateSectionSF(*sf,*rootSection,
>  remoteOffsets,*leafSection,*sectionSF);if (*ierr) return;
>
> }
>
> This is the sticking point.
>
> Sincerely
> Nicholas
>
>
> On Tue, Jan 10, 2023 at 12:38 PM Junchao Zhang <
> junchao.zh...@gmail.com> wrote:
>
>> Hi, Nicholas,
>>   Could you make a merge request to PETSc and then our Fortran
>> experts can comment on your MR?
>>   Thanks.
>>
>> --Junchao Zhang
>>
>>
>> On Tue, Jan 10, 2023 at 11:10 AM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Hi Junchao
>>>
>>> I think I'm almost there, but I could use some insight into how to
>>> use the PETSC_F90_2PTR_PROTO and F90Array1dAccess for the remoteOffset
>>> parameter input so if another function comes up, I can add it myself
>>> without wasting your time.
>>> I am very grateful for your help and time.
>>>
>>> Sincerely
>>> Nicholas
>>>
>>> On Tue, Jan 10, 2023 at 10:55 AM Junchao Zhang <
>>> junchao.zh...@gmail.com> wrote:
>>>
 Hi, Nicholas,
I am not a fortran guy, but I will try to add
 petscsfcreatesectionsf.

Thanks.
 --Junchao Zhang


Re: [petsc-users] PetscSF Fortran interface

2023-01-12 Thread Nicholas Arnold-Medabalimi
That is what I tried initially; however, I get a segmentation fault. I can
confirm it's due to the remote offsets because if I try and output
remoteoffsets between the Distribute Section and Create Section it throws
the same segmentation fault.

Thanks for the help
Nicholas

On Thu, Jan 12, 2023 at 11:56 PM Junchao Zhang 
wrote:

>
>
> On Thu, Jan 12, 2023 at 1:28 PM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Hi Junchao
>>
>> Going back to this merge request. I'm not sure I follow exactly the usage
>> described in the commit history. I have a c prototype of what I am trying
>> to do which is
>>
>> PetscSectionCreate(PETSC_COMM_WORLD, &leafSection);
>> PetscSFDistributeSection(redistributionSF, filteredSection_local, &remoteOffsets,
>> leafSection);
>> PetscSFCreateSectionSF(redistributionSF, filteredSection_local,
>> remoteOffsets, leafSection, &distributionSF_dof);
>>
>> But something seems unclear with the usage in fortran around the
>> remoteoffsets. Do I have to insert the CreateRemoteOffsetsF90 like so? Any
>> clarification would be greatly appreciated.
>>
>> call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
>> remoteoffsets, leafSection, ierr)
>> call PetscSFCreateRemoteOffsetsf90(distributionSF, section_filt_l,
>> leafSection, remoteoffsets, ierr )
>> call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
>> remoteoffsets, leafSection, distributionSF_dof, ierr)
>>
>> Hi, Nicholas,
>   Reading through comments at
> https://gitlab.com/petsc/petsc/-/merge_requests/5386#note_1022942470, I
> feel it should look like
>
> PetscInt, pointer :: remoteOffsets(:)
> call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
> remoteoffsets, leafSection, ierr)  // allocate remoteoffsets
> call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
> remoteoffsets, leafSection, distributionSF_dof, ierr)
> call PetscIntArray1dDestroyF90(remoteOffsets,ierr) // free remoteoffsets
> when not needed
>
> Could you try it?
>
>
> Sincerely
>> Nicholas
>>
>> On Tue, Jan 10, 2023 at 4:42 PM Junchao Zhang 
>> wrote:
>>
>>> Hi, Nicholas,
>>>It seems we have implemented it, but with another name,
>>> PetscSFCreateSectionSFF90, see
>>> https://gitlab.com/petsc/petsc/-/merge_requests/5386
>>>Try it to see if it works!
>>>
>>> --Junchao Zhang
>>>
>>>
>>> On Tue, Jan 10, 2023 at 11:45 AM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 Er to be honest I still can't get my stub to compile properly, and I
 don't know how to go about making a merge request. But here is what I am
 attempting right now. Let me know how best to proceed


 It's not exactly clear to me how to set up the remote offset properly.

 in src/vec/is/sf/interface/ftn-custom/zsf.c

 PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection
 *rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
 PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO(remoteoffsetsd))
 {

   int * remoteOffsets;
   *ierr = F90Array1dAccess(aremoteOffsets, PETSC_INT, (void**)
  &remoteOffsets PETSC_F90_2PTR_PARAM(remoteoffsetsd));if (*ierr) return;
   *ierr = PetscSFCreateSectionSF(*sf,*rootSection,
 remoteOffsets,*leafSection,*sectionSF);if (*ierr) return;

 }

 This is the sticking point.

 Sincerely
 Nicholas


 On Tue, Jan 10, 2023 at 12:38 PM Junchao Zhang 
 wrote:

> Hi, Nicholas,
>   Could you make a merge request to PETSc and then our Fortran experts
> can comment on your MR?
>   Thanks.
>
> --Junchao Zhang
>
>
> On Tue, Jan 10, 2023 at 11:10 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Hi Junchao
>>
>> I think I'm almost there, but I could use some insight into how to
>> use the PETSC_F90_2PTR_PROTO and F90Array1dAccess for the remoteOffset
>> parameter input so if another function comes up, I can add it myself
>> without wasting your time.
>> I am very grateful for your help and time.
>>
>> Sincerely
>> Nicholas
>>
>> On Tue, Jan 10, 2023 at 10:55 AM Junchao Zhang <
>> junchao.zh...@gmail.com> wrote:
>>
>>> Hi, Nicholas,
>>>I am not a fortran guy, but I will try to add
>>> petscsfcreatesectionsf.
>>>
>>>Thanks.
>>> --Junchao Zhang
>>>
>>>
>>> On Tue, Jan 10, 2023 at 12:50 AM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 I think it should be something like this, but I'm not very fluent
 in Fortran C interop syntax. Any advice would be appreciated. Thanks

 PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection
 *rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
 PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO
 (remoteoffsetsd))
 {

   int * remoteOffsets;
   *ierr = F90Array1dAccess(aremoteOffsets, 

Re: [petsc-users] PetscSF Fortran interface

2023-01-12 Thread Junchao Zhang
On Thu, Jan 12, 2023 at 1:28 PM Nicholas Arnold-Medabalimi <
narno...@umich.edu> wrote:

> Hi Junchao
>
> Going back to this merge request. I'm not sure I follow exactly the usage
> described in the commit history. I have a c prototype of what I am trying
> to do which is
>
> PetscSectionCreate(PETSC_COMM_WORLD, &leafSection);
> PetscSFDistributeSection(redistributionSF, filteredSection_local, &remoteOffsets,
> leafSection);
> PetscSFCreateSectionSF(redistributionSF, filteredSection_local,
> remoteOffsets, leafSection, &distributionSF_dof);
>
> But something seems unclear with the usage in fortran around the
> remoteoffsets. Do I have to insert the CreateRemoteOffsetsF90 like so? Any
> clarification would be greatly appreciated.
>
> call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
> remoteoffsets, leafSection, ierr)
> call PetscSFCreateRemoteOffsetsf90(distributionSF, section_filt_l,
> leafSection, remoteoffsets, ierr )
> call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
> remoteoffsets, leafSection, distributionSF_dof, ierr)
>
> Hi, Nicholas,
  Reading through comments at
https://gitlab.com/petsc/petsc/-/merge_requests/5386#note_1022942470, I
feel it should look like

PetscInt, pointer :: remoteOffsets(:)
call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
remoteoffsets, leafSection, ierr)  // allocate remoteoffsets
call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
remoteoffsets, leafSection, distributionSF_dof, ierr)
call PetscIntArray1dDestroyF90(remoteOffsets,ierr) // free remoteoffsets
when not needed

Could you try it?


Sincerely
> Nicholas
>
> On Tue, Jan 10, 2023 at 4:42 PM Junchao Zhang 
> wrote:
>
>> Hi, Nicholas,
>>It seems we have implemented it, but with another name,
>> PetscSFCreateSectionSFF90, see
>> https://gitlab.com/petsc/petsc/-/merge_requests/5386
>>Try it to see if it works!
>>
>> --Junchao Zhang
>>
>>
>> On Tue, Jan 10, 2023 at 11:45 AM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Er to be honest I still can't get my stub to compile properly, and I
>>> don't know how to go about making a merge request. But here is what I am
>>> attempting right now. Let me know how best to proceed
>>>
>>>
>>> It's not exactly clear to me how to set up the remote offset properly.
>>>
>>> in src/vec/is/sf/interface/ftn-custom/zsf.c
>>>
>>> PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection
>>> *rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
>>> PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO(remoteoffsetsd))
>>> {
>>>
>>>   int * remoteOffsets;
>>>   *ierr = F90Array1dAccess(aremoteOffsets, PETSC_INT, (void**)
>>>  &remoteOffsets PETSC_F90_2PTR_PARAM(remoteoffsetsd));if (*ierr) return;
>>>   *ierr = PetscSFCreateSectionSF(*sf,*rootSection,
>>>  remoteOffsets,*leafSection,*sectionSF);if (*ierr) return;
>>>
>>> }
>>>
>>> This is the sticking point.
>>>
>>> Sincerely
>>> Nicholas
>>>
>>>
>>> On Tue, Jan 10, 2023 at 12:38 PM Junchao Zhang 
>>> wrote:
>>>
 Hi, Nicholas,
   Could you make a merge request to PETSc and then our Fortran experts
 can comment on your MR?
   Thanks.

 --Junchao Zhang


 On Tue, Jan 10, 2023 at 11:10 AM Nicholas Arnold-Medabalimi <
 narno...@umich.edu> wrote:

> Hi Junchao
>
> I think I'm almost there, but I could use some insight into how to use
> the PETSC_F90_2PTR_PROTO and F90Array1dAccess for the remoteOffset
> parameter input so if another function comes up, I can add it myself
> without wasting your time.
> I am very grateful for your help and time.
>
> Sincerely
> Nicholas
>
> On Tue, Jan 10, 2023 at 10:55 AM Junchao Zhang <
> junchao.zh...@gmail.com> wrote:
>
>> Hi, Nicholas,
>>I am not a fortran guy, but I will try to add
>> petscsfcreatesectionsf.
>>
>>Thanks.
>> --Junchao Zhang
>>
>>
>> On Tue, Jan 10, 2023 at 12:50 AM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> I think it should be something like this, but I'm not very fluent in
>>> Fortran C interop syntax. Any advice would be appreciated. Thanks
>>>
>>> PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection *
>>> rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
>>> PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO(remoteoffsetsd))
>>> {
>>>
>>>   int * remoteOffsets;
>>>   *ierr = F90Array1dAccess(aremoteOffsets, PETSC_INT, (void**) &
>>> remoteOffsets PETSC_F90_2PTR_PARAM(remoteoffsetsd));if (*ierr)
>>> return;
>>>   *ierr = PetscSFCreateSectionSF(*sf,*rootSection, remoteOffsets,*
>>> leafSection,*sectionSF);if (*ierr) return;
>>>
>>> }
>>>
>>> On Mon, Jan 9, 2023 at 11:41 PM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 Hi Junchao

>>> Thanks again for your help in November. I've been using your
 merge request branch quite 

Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Jed Brown
Dave May  writes:

> On Thu 12. Jan 2023 at 17:58, Blaise Bourdin  wrote:
>
>> Out of curiosity, what is the rationale for _reading_ high order gmsh
>> meshes?
>>
>
> GMSH can use a CAD engine like OpenCascade. This provides geometric
> representations via things like BSplines. Such geometric representations are 
> not exposed to the user's application code, nor are they embedded in any 
> mesh format GMSH emits. The next best thing is to use a high order
> representation of the mesh geometry and project the CAD geometry (say a
> BSpline) into this higher order function space. The projection of the
> geometry is a quantity that can be described with the .msh format.

Adding to this, efficient methods for volumes with concave surfaces *must* use 
at least quadratic geometry. See Figure 5, where "p-refinement with linear 
geometry" causes anti-convergence (due the spurious stress singularities from 
the linear geometry, visible in Figure 4) while p-refinement with quadratic 
geometry is vastly more efficient despite the physical stress singularities 
that prevent exponential convergence. 

https://arxiv.org/pdf/2204.01722.pdf


Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Matthew Knepley
On Thu, Jan 12, 2023 at 4:10 PM Dave May  wrote:

> On Thu 12. Jan 2023 at 17:58, Blaise Bourdin  wrote:
>
>> Out of curiosity, what is the rationale for _reading_ high order gmsh
>> meshes?
>>
>
> GMSH can use a CAD engine like OpenCascade. This provides geometric
> representations via things like BSplines. Such geometric representations are
> not exposed to the user's application code, nor are they embedded in any
> mesh format GMSH emits. The next best thing is to use a high order
> representation of the mesh geometry and project the CAD geometry (say a
> BSpline) into this higher order function space. The projection of the
> geometry is a quantity that can be described with the .msh format.
>

Note that PETSc can directly read CAD files now and mesh over them.


> Is it so that one can write data back in native gmsh format?
>>
>
> No.
>
> Cheers,
> Dave
>
>>
>
>
>
>> Regards,
>> Blaise
>>
>>
>> On Jan 12, 2023, at 7:13 PM, Matthew Knepley  wrote:
>>
>> On Thu, Jan 12, 2023 at 1:33 PM Jed Brown  wrote:
>>
>>> It's confusing, but this line makes high order simplices always read as
>>> discontinuous coordinate spaces. I would love if someone would revisit
>>> that, perhaps also using DMPlexSetIsoperiodicFaceSF(),
>>
>>
>> Perhaps as a switch, but there is no way I am getting rid of the current
>> periodicity. As we have discussed before, breaking the topological relation
>> is a non-starter for me.
>>
>> It does look like higher order Gmsh does read as DG. We can just project
>> that to CG for non-periodic stuff.
>>
>>   Thanks,
>>
>> Matt
>>
>> which should simplify the code and avoid the confusing cell coordinates
>>> pattern. Sadly, I don't have time to dive in.
>>>
>>>
>>> https://gitlab.com/petsc/petsc/-/commit/066ea43f7f75752f012be6cd06b6107ebe84cc6d#3616cad8148970af5b97293c49492ff893e25b59_1552_1724
>>>
>>> "Daniel R. Shapero"  writes:
>>>
>>> > Sorry either your mail system or mine prevented me from attaching the
>>> file,
>>> > so I put it on pastebin:
>>> > https://pastebin.com/awFpc1Js
>>> >
>>> > On Wed, Jan 11, 2023 at 4:54 PM Matthew Knepley 
>>> wrote:
>>> >
>>> >> Can you send the .msh file? I still have not installed Gmsh :)
>>> >>
>>> >>   Thanks,
>>> >>
>>> >>  Matt
>>> >>
>>> >> On Wed, Jan 11, 2023 at 2:43 PM Daniel R. Shapero 
>>> wrote:
>>> >>
>>> >>> Hi all -- I'm trying to read in 2nd-order / piecewise quadratic
>>> meshes
>>> >>> that are generated by gmsh and I don't understand how the
>>> coordinates are
>>> >>> stored in the plex. I've been discussing this with Matt Knepley here
>>> >>> <
>>> https://urldefense.com/v3/__https://github.com/firedrakeproject/firedrake/issues/982__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2gOStva7A$
>>> >
>>> >>> as it pertains to Firedrake but I think this is more an issue at the
>>> PETSc
>>> >>> level.
>>> >>>
>>> >>> This code
>>> >>> <
>>> https://urldefense.com/v3/__https://gist.github.com/danshapero/a140daaf951ba58c48285ec29f5973cc__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2hho2eD1g$
>>> >
>>> >>> uses gmsh to generate a 2nd-order mesh of the unit disk, read it
>>> into a
>>> >>> DMPlex, print out the number of cells in each depth stratum, and
>>> finally
>>> >>> print a view of the coordinate DM's section. The resulting mesh has
>>> 64
>>> >>> triangles, 104 edges, and 41 vertices. For 2nd-order meshes, I'd
>>> expected
>>> >>> there to be 2 degrees of freedom at each node and 2 at each edge. The
>>> >>> output is:
>>> >>>
>>> >>> ```
>>> >>> Depth strata: [(64, 105), (105, 209), (0, 64)]
>>> >>>
>>> >>> PetscSection Object: 1 MPI process
>>> >>>   type not yet set
>>> >>> 1 fields
>>> >>>   field 0 with 2 components
>>> >>> Process 0:
>>> >>>   (   0) dim 12 offset   0
>>> >>>   (   1) dim 12 offset  12
>>> >>>   (   2) dim 12 offset  24
>>> >>> ...
>>> >>>   (  62) dim 12 offset 744
>>> >>>   (  63) dim 12 offset 756
>>> >>>   (  64) dim  0 offset 768
>>> >>>   (  65) dim  0 offset 768
>>> >>> ...
>>> >>>   ( 207) dim  0 offset 768
>>> >>>   ( 208) dim  0 offset 768
>>> >>>   PetscSectionSym Object: 1 MPI process
>>> >>> type: label
>>> >>> Label 'depth'
>>> >>> Symmetry for stratum value 0 (0 dofs per point): no symmetries
>>> >>> Symmetry for stratum value 1 (0 dofs per point): no symmetries
>>> >>> Symmetry for stratum value 2 (12 dofs per point):
>>> >>>   Orientation range: [-3, 3)
>>> >>> Symmetry for stratum value -1 (0 dofs per point): no symmetries
>>> >>> ```
>>> >>>
>>> >>> The output suggests that there are 12 degrees of freedom in each
>>> >>> triangle. That would mean the coordinate field is discontinuous
>>> across cell
>>> >>> boundaries. Can someone explain what's going on? I tried reading the
>>> .msh
>>> >>> file but it's totally inscrutable to me. I'm happy to RTFSC if
>>> someone
>>> >>> points me in the right direction. Matt tells me that the coordinate
>>> field
>>> >>> should only be 

Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Dave May
On Thu 12. Jan 2023 at 17:58, Blaise Bourdin  wrote:

> Out of curiosity, what is the rationale for _reading_ high order gmsh
> meshes?
>

GMSH can use a CAD engine like OpenCascade. This provides geometric
representations via things like BSplines. Such geometric representations are
not exposed to the user's application code, nor are they embedded in any
mesh format GMSH emits. The next best thing is to use a high order
representation of the mesh geometry and project the CAD geometry (say a
BSpline) into this higher order function space. The projection of the
geometry is a quantity that can be described with the .msh format.

Is it so that one can write data back in native gmsh format?
>

No.

Cheers,
Dave

>



> Regards,
> Blaise
>
>
> On Jan 12, 2023, at 7:13 PM, Matthew Knepley  wrote:
>
> On Thu, Jan 12, 2023 at 1:33 PM Jed Brown  wrote:
>
>> It's confusing, but this line makes high order simplices always read as
>> discontinuous coordinate spaces. I would love if someone would revisit
>> that, perhaps also using DMPlexSetIsoperiodicFaceSF(),
>
>
> Perhaps as a switch, but there is no way I am getting rid of the current
> periodicity. As we have discussed before, breaking the topological relation
> is a non-starter for me.
>
> It does look like higher order Gmsh does read as DG. We can just project
> that to CG for non-periodic stuff.
>
>   Thanks,
>
> Matt
>
> which should simplify the code and avoid the confusing cell coordinates
>> pattern. Sadly, I don't have time to dive in.
>>
>>
>> https://gitlab.com/petsc/petsc/-/commit/066ea43f7f75752f012be6cd06b6107ebe84cc6d#3616cad8148970af5b97293c49492ff893e25b59_1552_1724
>>
>> "Daniel R. Shapero"  writes:
>>
>> > Sorry either your mail system or mine prevented me from attaching the
>> file,
>> > so I put it on pastebin:
>> > https://pastebin.com/awFpc1Js
>> >
>> > On Wed, Jan 11, 2023 at 4:54 PM Matthew Knepley 
>> wrote:
>> >
>> >> Can you send the .msh file? I still have not installed Gmsh :)
>> >>
>> >>   Thanks,
>> >>
>> >>  Matt
>> >>
>> >> On Wed, Jan 11, 2023 at 2:43 PM Daniel R. Shapero 
>> wrote:
>> >>
>> >>> Hi all -- I'm trying to read in 2nd-order / piecewise quadratic meshes
>> >>> that are generated by gmsh and I don't understand how the coordinates
>> are
>> >>> stored in the plex. I've been discussing this with Matt Knepley here
>> >>> <
>> https://urldefense.com/v3/__https://github.com/firedrakeproject/firedrake/issues/982__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2gOStva7A$
>> >
>> >>> as it pertains to Firedrake but I think this is more an issue at the
>> PETSc
>> >>> level.
>> >>>
>> >>> This code
>> >>> <
>> https://urldefense.com/v3/__https://gist.github.com/danshapero/a140daaf951ba58c48285ec29f5973cc__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2hho2eD1g$
>> >
>> >>> uses gmsh to generate a 2nd-order mesh of the unit disk, read it into
>> a
>> >>> DMPlex, print out the number of cells in each depth stratum, and
>> finally
>> >>> print a view of the coordinate DM's section. The resulting mesh has 64
>> >>> triangles, 104 edges, and 41 vertices. For 2nd-order meshes, I'd
>> expected
>> >>> there to be 2 degrees of freedom at each node and 2 at each edge. The
>> >>> output is:
>> >>>
>> >>> ```
>> >>> Depth strata: [(64, 105), (105, 209), (0, 64)]
>> >>>
>> >>> PetscSection Object: 1 MPI process
>> >>>   type not yet set
>> >>> 1 fields
>> >>>   field 0 with 2 components
>> >>> Process 0:
>> >>>   (   0) dim 12 offset   0
>> >>>   (   1) dim 12 offset  12
>> >>>   (   2) dim 12 offset  24
>> >>> ...
>> >>>   (  62) dim 12 offset 744
>> >>>   (  63) dim 12 offset 756
>> >>>   (  64) dim  0 offset 768
>> >>>   (  65) dim  0 offset 768
>> >>> ...
>> >>>   ( 207) dim  0 offset 768
>> >>>   ( 208) dim  0 offset 768
>> >>>   PetscSectionSym Object: 1 MPI process
>> >>> type: label
>> >>> Label 'depth'
>> >>> Symmetry for stratum value 0 (0 dofs per point): no symmetries
>> >>> Symmetry for stratum value 1 (0 dofs per point): no symmetries
>> >>> Symmetry for stratum value 2 (12 dofs per point):
>> >>>   Orientation range: [-3, 3)
>> >>> Symmetry for stratum value -1 (0 dofs per point): no symmetries
>> >>> ```
>> >>>
>> >>> The output suggests that there are 12 degrees of freedom in each
>> >>> triangle. That would mean the coordinate field is discontinuous
>> across cell
>> >>> boundaries. Can someone explain what's going on? I tried reading the
>> .msh
>> >>> file but it's totally inscrutable to me. I'm happy to RTFSC if someone
>> >>> points me in the right direction. Matt tells me that the coordinate
>> field
>> >>> should only be discontinuous if the mesh is periodic, but this mesh
>> >>> shouldn't be periodic.
>> >>>
>> >>
>> >>
>> >> --
>> >> What most experimenters take for granted before they begin their
>> >> experiments is infinitely more interesting than any results to which
>> their
>> >> experiments lead.
>> >> 

Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Matthew Knepley
On Thu, Jan 12, 2023 at 3:57 PM Blaise Bourdin  wrote:

> Out of curiosity, what is the rationale for _reading_ high order gmsh
> meshes?
> Is it so that one can write data back in native gmsh format?
>

So we can use meshes that other people design I think.

   Matt


> Regards,
> Blaise
>
>
> On Jan 12, 2023, at 7:13 PM, Matthew Knepley  wrote:
>
> On Thu, Jan 12, 2023 at 1:33 PM Jed Brown  wrote:
>
>> It's confusing, but this line makes high order simplices always read as
>> discontinuous coordinate spaces. I would love if someone would revisit
>> that, perhaps also using DMPlexSetIsoperiodicFaceSF(),
>
>
> Perhaps as a switch, but there is no way I am getting rid of the current
> periodicity. As we have discussed before, breaking the topological relation
> is a non-starter for me.
>
> It does look like higher order Gmsh does read as DG. We can just project
> that to CG for non-periodic stuff.
>
>   Thanks,
>
> Matt
>
> which should simplify the code and avoid the confusing cell coordinates
>> pattern. Sadly, I don't have time to dive in.
>>
>>
>> https://gitlab.com/petsc/petsc/-/commit/066ea43f7f75752f012be6cd06b6107ebe84cc6d#3616cad8148970af5b97293c49492ff893e25b59_1552_1724
>>
>> "Daniel R. Shapero"  writes:
>>
>> > Sorry either your mail system or mine prevented me from attaching the
>> file,
>> > so I put it on pastebin:
>> > https://pastebin.com/awFpc1Js
>> >
>> > On Wed, Jan 11, 2023 at 4:54 PM Matthew Knepley 
>> wrote:
>> >
>> >> Can you send the .msh file? I still have not installed Gmsh :)
>> >>
>> >>   Thanks,
>> >>
>> >>  Matt
>> >>
>> >> On Wed, Jan 11, 2023 at 2:43 PM Daniel R. Shapero 
>> wrote:
>> >>
>> >>> Hi all -- I'm trying to read in 2nd-order / piecewise quadratic meshes
>> >>> that are generated by gmsh and I don't understand how the coordinates
>> are
>> >>> stored in the plex. I've been discussing this with Matt Knepley here
>> >>> <
>> https://urldefense.com/v3/__https://github.com/firedrakeproject/firedrake/issues/982__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2gOStva7A$
>> >
>> >>> as it pertains to Firedrake but I think this is more an issue at the
>> PETSc
>> >>> level.
>> >>>
>> >>> This code
>> >>> <
>> https://urldefense.com/v3/__https://gist.github.com/danshapero/a140daaf951ba58c48285ec29f5973cc__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2hho2eD1g$
>> >
>> >>> uses gmsh to generate a 2nd-order mesh of the unit disk, read it into
>> a
>> >>> DMPlex, print out the number of cells in each depth stratum, and
>> finally
>> >>> print a view of the coordinate DM's section. The resulting mesh has 64
>> >>> triangles, 104 edges, and 41 vertices. For 2nd-order meshes, I'd
>> expected
>> >>> there to be 2 degrees of freedom at each node and 2 at each edge. The
>> >>> output is:
>> >>>
>> >>> ```
>> >>> Depth strata: [(64, 105), (105, 209), (0, 64)]
>> >>>
>> >>> PetscSection Object: 1 MPI process
>> >>>   type not yet set
>> >>> 1 fields
>> >>>   field 0 with 2 components
>> >>> Process 0:
>> >>>   (   0) dim 12 offset   0
>> >>>   (   1) dim 12 offset  12
>> >>>   (   2) dim 12 offset  24
>> >>> ...
>> >>>   (  62) dim 12 offset 744
>> >>>   (  63) dim 12 offset 756
>> >>>   (  64) dim  0 offset 768
>> >>>   (  65) dim  0 offset 768
>> >>> ...
>> >>>   ( 207) dim  0 offset 768
>> >>>   ( 208) dim  0 offset 768
>> >>>   PetscSectionSym Object: 1 MPI process
>> >>> type: label
>> >>> Label 'depth'
>> >>> Symmetry for stratum value 0 (0 dofs per point): no symmetries
>> >>> Symmetry for stratum value 1 (0 dofs per point): no symmetries
>> >>> Symmetry for stratum value 2 (12 dofs per point):
>> >>>   Orientation range: [-3, 3)
>> >>> Symmetry for stratum value -1 (0 dofs per point): no symmetries
>> >>> ```
>> >>>
>> >>> The output suggests that there are 12 degrees of freedom in each
>> >>> triangle. That would mean the coordinate field is discontinuous
>> across cell
>> >>> boundaries. Can someone explain what's going on? I tried reading the
>> .msh
>> >>> file but it's totally inscrutable to me. I'm happy to RTFSC if someone
>> >>> points me in the right direction. Matt tells me that the coordinate
>> field
>> >>> should only be discontinuous if the mesh is periodic, but this mesh
>> >>> shouldn't be periodic.
>> >>>
>> >>
>> >>
>> >> --
>> >> What most experimenters take for granted before they begin their
>> >> experiments is infinitely more interesting than any results to which
>> their
>> >> experiments lead.
>> >> -- Norbert Wiener
>> >>
>> >> https://www.cse.buffalo.edu/~knepley/
>> >> <
>> https://urldefense.com/v3/__http://www.cse.buffalo.edu/*knepley/__;fg!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2go23tjRg$
>> >
>> >>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener

Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Blaise Bourdin



Out of curiosity, what is the rationale for _reading_ high order gmsh meshes?
Is it so that one can write data back in native gmsh format? 


Regards,
Blaise



On Jan 12, 2023, at 7:13 PM, Matthew Knepley  wrote:



On Thu, Jan 12, 2023 at 1:33 PM Jed Brown  wrote:



It's confusing, but this line makes high order simplices always read as discontinuous coordinate spaces. I would love if someone would revisit that, perhaps also using DMPlexSetIsoperiodicFaceSF(),


Perhaps as a switch, but there is no way I am getting rid of the current periodicity. As we have discussed before, breaking the topological relation is a non-starter for me.


It does look like higher order Gmsh does read as DG. We can just project that to CG for non-periodic stuff.


  Thanks,


    Matt



which should simplify the code and avoid the confusing cell coordinates pattern. Sadly, I don't have time to dive in.

https://gitlab.com/petsc/petsc/-/commit/066ea43f7f75752f012be6cd06b6107ebe84cc6d#3616cad8148970af5b97293c49492ff893e25b59_1552_1724

"Daniel R. Shapero"  writes:

> Sorry either your mail system or mine prevented me from attaching the file,
> so I put it on pastebin:
> https://pastebin.com/awFpc1Js
>
> On Wed, Jan 11, 2023 at 4:54 PM Matthew Knepley  wrote:
>
>> Can you send the .msh file? I still have not installed Gmsh :)
>>
>>   Thanks,
>>
>>      Matt
>>
>> On Wed, Jan 11, 2023 at 2:43 PM Daniel R. Shapero  wrote:
>>
>>> Hi all -- I'm trying to read in 2nd-order / piecewise quadratic meshes
>>> that are generated by gmsh and I don't understand how the coordinates are
>>> stored in the plex. I've been discussing this with Matt Knepley here
>>> 
>>> as it pertains to Firedrake but I think this is more an issue at the PETSc
>>> level.
>>>
>>> This code
>>> 
>>> uses gmsh to generate a 2nd-order mesh of the unit disk, read it into a
>>> DMPlex, print out the number of cells in each depth stratum, and finally
>>> print a view of the coordinate DM's section. The resulting mesh has 64
>>> triangles, 104 edges, and 41 vertices. For 2nd-order meshes, I'd expected
>>> there to be 2 degrees of freedom at each node and 2 at each edge. The
>>> output is:
>>>
>>> ```
>>> Depth strata: [(64, 105), (105, 209), (0, 64)]
>>>
>>> PetscSection Object: 1 MPI process
>>>   type not yet set
>>> 1 fields
>>>   field 0 with 2 components
>>> Process 0:
>>>   (   0) dim 12 offset   0
>>>   (   1) dim 12 offset  12
>>>   (   2) dim 12 offset  24
>>> ...
>>>   (  62) dim 12 offset 744
>>>   (  63) dim 12 offset 756
>>>   (  64) dim  0 offset 768
>>>   (  65) dim  0 offset 768
>>> ...
>>>   ( 207) dim  0 offset 768
>>>   ( 208) dim  0 offset 768
>>>   PetscSectionSym Object: 1 MPI process
>>>     type: label
>>>     Label 'depth'
>>>     Symmetry for stratum value 0 (0 dofs per point): no symmetries
>>>     Symmetry for stratum value 1 (0 dofs per point): no symmetries
>>>     Symmetry for stratum value 2 (12 dofs per point):
>>>       Orientation range: [-3, 3)
>>>     Symmetry for stratum value -1 (0 dofs per point): no symmetries
>>> ```
>>>
>>> The output suggests that there are 12 degrees of freedom in each
>>> triangle. That would mean the coordinate field is discontinuous across cell
>>> boundaries. Can someone explain what's going on? I tried reading the .msh
>>> file but it's totally inscrutable to me. I'm happy to RTFSC if someone
>>> points me in the right direction. Matt tells me that the coordinate field
>>> should only be discontinuous if the mesh is periodic, but this mesh
>>> shouldn't be periodic.
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>





-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

— 
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, 

Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Matthew Knepley
On Thu, Jan 12, 2023 at 1:33 PM Jed Brown  wrote:

> It's confusing, but this line makes high order simplices always read as
> discontinuous coordinate spaces. I would love if someone would revisit
> that, perhaps also using DMPlexSetIsoperiodicFaceSF(),


Perhaps as a switch, but there is no way I am getting rid of the current
periodicity. As we have discussed before, breaking the topological relation
is a non-starter for me.

It does look like higher order Gmsh does read as DG. We can just project
that to CG for non-periodic stuff.

  Thanks,

Matt

which should simplify the code and avoid the confusing cell coordinates
> pattern. Sadly, I don't have time to dive in.
>
>
> https://gitlab.com/petsc/petsc/-/commit/066ea43f7f75752f012be6cd06b6107ebe84cc6d#3616cad8148970af5b97293c49492ff893e25b59_1552_1724
>
> "Daniel R. Shapero"  writes:
>
> > Sorry either your mail system or mine prevented me from attaching the
> file,
> > so I put it on pastebin:
> > https://pastebin.com/awFpc1Js
> >
> > On Wed, Jan 11, 2023 at 4:54 PM Matthew Knepley 
> wrote:
> >
> >> Can you send the .msh file? I still have not installed Gmsh :)
> >>
> >>   Thanks,
> >>
> >>  Matt
> >>
> >> On Wed, Jan 11, 2023 at 2:43 PM Daniel R. Shapero 
> wrote:
> >>
> >>> Hi all -- I'm trying to read in 2nd-order / piecewise quadratic meshes
> >>> that are generated by gmsh and I don't understand how the coordinates
> are
> >>> stored in the plex. I've been discussing this with Matt Knepley here
> >>> <
> https://urldefense.com/v3/__https://github.com/firedrakeproject/firedrake/issues/982__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2gOStva7A$
> >
> >>> as it pertains to Firedrake but I think this is more an issue at the
> PETSc
> >>> level.
> >>>
> >>> This code
> >>> <
> https://urldefense.com/v3/__https://gist.github.com/danshapero/a140daaf951ba58c48285ec29f5973cc__;!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2hho2eD1g$
> >
> >>> uses gmsh to generate a 2nd-order mesh of the unit disk, read it into a
> >>> DMPlex, print out the number of cells in each depth stratum, and
> finally
> >>> print a view of the coordinate DM's section. The resulting mesh has 64
> >>> triangles, 104 edges, and 41 vertices. For 2nd-order meshes, I'd
> expected
> >>> there to be 2 degrees of freedom at each node and 2 at each edge. The
> >>> output is:
> >>>
> >>> ```
> >>> Depth strata: [(64, 105), (105, 209), (0, 64)]
> >>>
> >>> PetscSection Object: 1 MPI process
> >>>   type not yet set
> >>> 1 fields
> >>>   field 0 with 2 components
> >>> Process 0:
> >>>   (   0) dim 12 offset   0
> >>>   (   1) dim 12 offset  12
> >>>   (   2) dim 12 offset  24
> >>> ...
> >>>   (  62) dim 12 offset 744
> >>>   (  63) dim 12 offset 756
> >>>   (  64) dim  0 offset 768
> >>>   (  65) dim  0 offset 768
> >>> ...
> >>>   ( 207) dim  0 offset 768
> >>>   ( 208) dim  0 offset 768
> >>>   PetscSectionSym Object: 1 MPI process
> >>> type: label
> >>> Label 'depth'
> >>> Symmetry for stratum value 0 (0 dofs per point): no symmetries
> >>> Symmetry for stratum value 1 (0 dofs per point): no symmetries
> >>> Symmetry for stratum value 2 (12 dofs per point):
> >>>   Orientation range: [-3, 3)
> >>> Symmetry for stratum value -1 (0 dofs per point): no symmetries
> >>> ```
> >>>
> >>> The output suggests that there are 12 degrees of freedom in each
> >>> triangle. That would mean the coordinate field is discontinuous across
> cell
> >>> boundaries. Can someone explain what's going on? I tried reading the
> .msh
> >>> file but it's totally inscrutable to me. I'm happy to RTFSC if someone
> >>> points me in the right direction. Matt tells me that the coordinate
> field
> >>> should only be discontinuous if the mesh is periodic, but this mesh
> >>> shouldn't be periodic.
> >>>
> >>
> >>
> >> --
> >> What most experimenters take for granted before they begin their
> >> experiments is infinitely more interesting than any results to which
> their
> >> experiments lead.
> >> -- Norbert Wiener
> >>
> >> https://www.cse.buffalo.edu/~knepley/
> >> <
> https://urldefense.com/v3/__http://www.cse.buffalo.edu/*knepley/__;fg!!K-Hz7m0Vt54!hL9WLR51ieyHFZx8N9AjhDwJCRpvmQto9CL1XOTkkAxFfUbtsabHuBDOATnWyP6lQszhA2go23tjRg$
> >
> >>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] coordinate degrees of freedom for 2nd-order gmsh mesh

2023-01-12 Thread Jed Brown
It's confusing, but this line makes high order simplices always read as 
discontinuous coordinate spaces. I would love if someone would revisit that, 
perhaps also using DMPlexSetIsoperiodicFaceSF(), which should simplify the code 
and avoid the confusing cell coordinates pattern. Sadly, I don't have time to 
dive in.

https://gitlab.com/petsc/petsc/-/commit/066ea43f7f75752f012be6cd06b6107ebe84cc6d#3616cad8148970af5b97293c49492ff893e25b59_1552_1724

"Daniel R. Shapero"  writes:

> Sorry either your mail system or mine prevented me from attaching the file,
> so I put it on pastebin:
> https://pastebin.com/awFpc1Js
>
> On Wed, Jan 11, 2023 at 4:54 PM Matthew Knepley  wrote:
>
>> Can you send the .msh file? I still have not installed Gmsh :)
>>
>>   Thanks,
>>
>>  Matt
>>
>> On Wed, Jan 11, 2023 at 2:43 PM Daniel R. Shapero  wrote:
>>
>>> Hi all -- I'm trying to read in 2nd-order / piecewise quadratic meshes
>>> that are generated by gmsh and I don't understand how the coordinates are
>>> stored in the plex. I've been discussing this with Matt Knepley here
>>> 
>>> as it pertains to Firedrake but I think this is more an issue at the PETSc
>>> level.
>>>
>>> This code
>>> 
>>> uses gmsh to generate a 2nd-order mesh of the unit disk, read it into a
>>> DMPlex, print out the number of cells in each depth stratum, and finally
>>> print a view of the coordinate DM's section. The resulting mesh has 64
>>> triangles, 104 edges, and 41 vertices. For 2nd-order meshes, I'd expected
>>> there to be 2 degrees of freedom at each node and 2 at each edge. The
>>> output is:
>>>
>>> ```
>>> Depth strata: [(64, 105), (105, 209), (0, 64)]
>>>
>>> PetscSection Object: 1 MPI process
>>>   type not yet set
>>> 1 fields
>>>   field 0 with 2 components
>>> Process 0:
>>>   (   0) dim 12 offset   0
>>>   (   1) dim 12 offset  12
>>>   (   2) dim 12 offset  24
>>> ...
>>>   (  62) dim 12 offset 744
>>>   (  63) dim 12 offset 756
>>>   (  64) dim  0 offset 768
>>>   (  65) dim  0 offset 768
>>> ...
>>>   ( 207) dim  0 offset 768
>>>   ( 208) dim  0 offset 768
>>>   PetscSectionSym Object: 1 MPI process
>>> type: label
>>> Label 'depth'
>>> Symmetry for stratum value 0 (0 dofs per point): no symmetries
>>> Symmetry for stratum value 1 (0 dofs per point): no symmetries
>>> Symmetry for stratum value 2 (12 dofs per point):
>>>   Orientation range: [-3, 3)
>>> Symmetry for stratum value -1 (0 dofs per point): no symmetries
>>> ```
>>>
>>> The output suggests that there are 12 degrees of freedom in each
>>> triangle. That would mean the coordinate field is discontinuous across cell
>>> boundaries. Can someone explain what's going on? I tried reading the .msh
>>> file but it's totally inscrutable to me. I'm happy to RTFSC if someone
>>> points me in the right direction. Matt tells me that the coordinate field
>>> should only be discontinuous if the mesh is periodic, but this mesh
>>> shouldn't be periodic.
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>
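
As an aside (not part of the original exchange): the "dim 12" entries in the
dump above are consistent with a P2 triangle stored cell-wise, i.e. 6 nodes
(3 vertices + 3 edge midpoints) times 2 coordinate components. A minimal sketch
for inspecting both the vertex-based and the cell-based (DG) coordinate
sections, assuming PETSc >= 3.18 and a hypothetical mesh file disk.msh:

DM           dm;
PetscSection cs, dgcs;

PetscCall(DMPlexCreateFromFile(PETSC_COMM_WORLD, "disk.msh", "mesh", PETSC_TRUE, &dm));
PetscCall(DMGetCoordinateSection(dm, &cs));       /* continuous (vertex-based) coordinates */
PetscCall(DMGetCellCoordinateSection(dm, &dgcs)); /* discontinuous (cell-based) coordinates, if any */
if (cs)   PetscCall(PetscSectionView(cs, PETSC_VIEWER_STDOUT_WORLD));
if (dgcs) PetscCall(PetscSectionView(dgcs, PETSC_VIEWER_STDOUT_WORLD));
PetscCall(DMDestroy(&dm));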


Re: [petsc-users] PetscSF Fortran interface

2023-01-12 Thread Nicholas Arnold-Medabalimi
Hi Junchao

Going back to this merge request: I'm not sure I follow exactly the usage
described in the commit history. I have a C prototype of what I am trying
to do, which is:

PetscSectionCreate(PETSC_COMM_WORLD, &leafSection);
PetscSFDistributeSection(redistributionSF, filteredSection_local, &remoteOffsets,
leafSection);
PetscSFCreateSectionSF(redistributionSF, filteredSection_local,
remoteOffsets, leafSection, &distributionSF_dof);

But something seems unclear with the usage in Fortran around the
remoteoffsets. Do I have to insert the CreateRemoteOffsetsF90 like so? Any
clarification would be greatly appreciated.

call PetscSFDistributeSectionF90(distributionSF, section_filt_l,
remoteoffsets, leafSection, ierr)
call PetscSFCreateRemoteOffsetsf90(distributionSF, section_filt_l,
leafSection, remoteoffsets, ierr )
call PetscSFCreateSectionSFF90(distributionSF, section_filt_l,
remoteoffsets, leafSection, distributionSF_dof, ierr)

Sincerely
Nicholas

On Tue, Jan 10, 2023 at 4:42 PM Junchao Zhang 
wrote:

> Hi, Nicholas,
>It seems we have implemented it, but with another name,
> PetscSFCreateSectionSFF90, see
> https://gitlab.com/petsc/petsc/-/merge_requests/5386
>Try it to see if it works!
>
> --Junchao Zhang
>
>
> On Tue, Jan 10, 2023 at 11:45 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Er to be honest I still can't get my stub to compile properly, and I
>> don't know how to go about making a merge request. But here is what I am
>> attempting right now. Let me know how best to proceed
>>
>>
>> It's not exactly clear to me how to set up the remote offset properly.
>>
>> in src/vec/is/sf/interface/ftn-custom/zsf.c
>>
>> PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection
>> *rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
>> PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO(remoteoffsetsd))
>> {
>>
>>   int * remoteOffsets;
>>   *ierr = F90Array1dAccess(aremoteOffsets, PETSC_INT, (void**)
>>  &remoteOffsets PETSC_F90_2PTR_PARAM(remoteoffsetsd));if (*ierr) return;
>>   *ierr = PetscSFCreateSectionSF(*sf,*rootSection,
>>  remoteOffsets,*leafSection,*sectionSF);if (*ierr) return;
>>
>> }
>>
>> This is the sticking point.
>>
>> Sincerely
>> Nicholas
>>
>>
>> On Tue, Jan 10, 2023 at 12:38 PM Junchao Zhang 
>> wrote:
>>
>>> Hi, Nicholas,
>>>   Could you make a merge request to PETSc and then our Fortran experts
>>> can comment on your MR?
>>>   Thanks.
>>>
>>> --Junchao Zhang
>>>
>>>
>>> On Tue, Jan 10, 2023 at 11:10 AM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 Hi Junchao

 I think I'm almost there, but I could use some insight into how to use
 the PETSC_F90_2PTR_PROTO and F90Array1dAccess for the remoteOffset
 parameter input so if another function comes up, I can add it myself
 without wasting your time.
 I am very grateful for your help and time.

 Sincerely
 Nicholas

 On Tue, Jan 10, 2023 at 10:55 AM Junchao Zhang 
 wrote:

> Hi, Nicholas,
>I am not a fortran guy, but I will try to add
> petscsfcreatesectionsf.
>
>Thanks.
> --Junchao Zhang
>
>
> On Tue, Jan 10, 2023 at 12:50 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> I think it should be something like this, but I'm not very fluent in
>> Fortran C interop syntax. Any advice would be appreciated. Thanks
>>
>> PETSC_EXTERN void petscsfcreatesectionsf(PetscSF *sf, PetscSection *
>> rootSection, F90Array1d *aremoteOffsets, PetscSection *leafSection,
>> PetscSF *sectionSF, int * ierr PETSC_F90_2PTR_PROTO(remoteoffsetsd))
>> {
>>
>>   int * remoteOffsets;
>>   *ierr = F90Array1dAccess(aremoteOffsets, PETSC_INT, (void**) &
>> remoteOffsets PETSC_F90_2PTR_PARAM(remoteoffsetsd));if (*ierr) return
>> ;
>>   *ierr = PetscSFCreateSectionSF(*sf,*rootSection, remoteOffsets,*
>> leafSection,*sectionSF);if (*ierr) return;
>>
>> }
>>
>> On Mon, Jan 9, 2023 at 11:41 PM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Hi Junchao
>>>
>>> Thanks again for your help in November. I've been using your
>>> merge request branch quite heavily. Would it be possible to add a
>>> petscsfcreatesectionsf interface as well?
>>> I'm trying to write it myself using your commits as a guide but I
>>> have been struggling with handling the section parameter properly.
>>>
>>> Sincerely
>>> Nicholas
>>>
>>> On Sat, Nov 19, 2022 at 9:44 PM Junchao Zhang <
>>> junchao.zh...@gmail.com> wrote:
>>>



 On Sat, Nov 19, 2022 at 8:05 PM Nicholas Arnold-Medabalimi <
 narno...@umich.edu> wrote:

> Hi
>
> Thanks, this is awesome. Thanks for the very prompt fix. Just one
> question: will the array outputs on the fortran side copies (and need 
> to be
> deallocated) or direct access to the dmplex?
>

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2023-01-12 Thread Matteo Semplice


On 23/12/22 17:14, Matthew Knepley wrote:
On Thu, Dec 22, 2022 at 3:08 PM Matteo Semplice 
 wrote:



On 22/12/22 20:06, Dave May wrote:



On Thu 22. Dec 2022 at 10:27, Matteo Semplice
 wrote:

Dear Dave and Matt,

    I am really dealing with two different use cases in a
code that will compute a levelset function passing through a
large set of points. If I had DMSwarmSetMigrateType() and if
it were safe to switch the migration mode back and forth in
the same swarm, this would cover all my use cases here. Is it
safe to add it back to petsc? Details below if you are curious.

1) During preprocessing I am loading a point cloud from disk
(in whatever order it comes) and need to send the particles
to the right ranks. Since the background DM is a DMDA I can
easily figure out the destination rank. This would be covered
by your suggestion not to attach the DM, except that later I
need to locate these points with respect to the background
cells in order to initialize data on the Vecs associated to
the DMDA.

2) Then I need to implement a semilagrangian time evolution
scheme. For this I'd like to send particles around at the
"foot of characteristic", collect data there and then send
them back to the originating point. The first migration would
be based on particle coordinates
(DMSwarmMigrate_DMNeighborScatter and the restriction to only
neighbouring ranks is perfect), while for the second move it
would be easier to just send them back to the originating
rank, which I can easily store in an Int field in the swarm.
Thus at each timestep I'd need to swap migrate types in this
swarm (DMScatter for moving them to the feet and BASIC to
send them back).


When you use BASIC, you would have to explicitly call the point
location routine from your code as BASIC does not interact with
the DM.

Based on what I see in the code, switching  migrate modes between
basic and dmneighbourscatter should be safe.

If you are fine calling the point location from your side then
what you propose should work.


If I understood the code correctly, BASIC will just migrate
particles sending them to what is stored in DMSwarmField_rank,
right? That'd be easy since I can create a SWARM with all the data
I need and an extra int field (say "original_rank") and copy those
values into DMSwarmField_rank before calling migrate for the
"going back" step. After this backward migration I do not need to
locate particles again (e.g. I do not need DMSwarmSortGetAccess
after the BASIC migration, but only after the DMNeighborScatter one).

Thus having back DMSwarmSetMigrateType() should be enough for me.

Hi Matteo,

I have done this in

https://gitlab.com/petsc/petsc/-/merge_requests/5941 



I also hope to get the fix for your DMDA issue in there.


Hi.

I have finally got round to testing the updates and, using the main 
branch, my issues are fixed.


One thing I have noticed: after a DMSwarmMigrate_DMNeighborScatter, the 
field DMSwarmField_rank has the same content as the field 
DMSwarmPICField_cellid. It does not affect me, but it seems a little 
strange and might surprise users... In the long term, a word in the docs 
about the names/content of the fields that are automatically created in 
a swarm would be helpful.


Thanks!

    Matteo
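
For illustration (not part of the original exchange), a sketch of the "send
back" step described above; it assumes DMSwarmSetMigrateType() is available
(being re-added in MR 5941 per this thread), that an extra Int field named
"original_rank" was registered at swarm setup, and that DMSwarmField_rank holds
PetscInt data:

PetscInt *target_rank, *orig_rank;
PetscInt  npoints, p;

PetscCall(DMSwarmSetMigrateType(swarm, DMSWARM_MIGRATE_BASIC));
PetscCall(DMSwarmGetLocalSize(swarm, &npoints));
PetscCall(DMSwarmGetField(swarm, DMSwarmField_rank, NULL, NULL, (void **)&target_rank));
PetscCall(DMSwarmGetField(swarm, "original_rank", NULL, NULL, (void **)&orig_rank));
for (p = 0; p < npoints; ++p) target_rank[p] = orig_rank[p]; /* send each point back to its originating rank */
PetscCall(DMSwarmRestoreField(swarm, "original_rank", NULL, NULL, (void **)&orig_rank));
PetscCall(DMSwarmRestoreField(swarm, DMSwarmField_rank, NULL, NULL, (void **)&target_rank));
PetscCall(DMSwarmMigrate(swarm, PETSC_TRUE)); /* BASIC migration: points go to the rank stored in DMSwarm_rank */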





Re: [petsc-users] PETSC install

2023-01-12 Thread Barry Smith

  If you put the variable in the .bashrc file then you must 

  source ~/.bashrc 

  before running the 

  make check


  Barry


> On Jan 11, 2023, at 11:52 PM, Sijie Zhang  wrote:
> 
> Yes, I followed the exact instructions. I’m using bash. I put the 
> environment variable in the .bashrc file. This only happens on my Intel 
> i129700 workstation. Is it because of the hardware?
>  
> Thanks.
>  
> Sijie 
>  
> Sent from Mail  for Windows
>  
> From: Matthew Knepley 
> Sent: Wednesday, January 11, 2023 7:40 PM
> To: Barry Smith 
> Cc: Sijie Zhang ; petsc-users@mcs.anl.gov 
> 
> Subject: Re: [petsc-users] PETSC install
>  
>  
> On Wed, Jan 11, 2023 at 2:33 PM Barry Smith wrote:
>  
>   Did you do exactly: 
>  
> export HWLOC_HIDE_ERRORS=2
> make check
>  
> Also, what shell are you using? The command above is for bash, but if you use 
> csh it is different.
>  
>   Thanks,
>  
>  Matt
>  
> ?
> 
> 
> 
> 
> On Jan 11, 2023, at 6:51 PM, Sijie Zhang wrote:
>  
> Hi,
> 
> I tried that but it's showing the same error.  Can you help me to take a look 
> at that?
> 
> Thanks.
> 
> Sijie
> 
> +
> 
> Running check examples to verify correct installation
> Using PETSC_DIR=/home/zhangsijie1995/Documents/Package/petsc-3.18.3 and 
> PETSC_ARCH=arch-linux-c-debug
> Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process
> See https://petsc.org/release/faq/
> hwloc/linux: Ignoring PCI device with non-16bit domain.
> Pass --enable-32bits-pci-domain to configure to support such devices
> (warning: it would break the library ABI, don't enable unless really needed).
> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> Number of SNES iterations = 2
> Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes
> See https://petsc.org/release/faq/
> hwloc/linux: Ignoring PCI device with non-16bit domain.
> Pass --enable-32bits-pci-domain to configure to support such devices
> (warning: it would break the library ABI, don't enable unless really needed).
> hwloc/linux: Ignoring PCI device with non-16bit domain.
> Pass --enable-32bits-pci-domain to configure to support such devices
> (warning: it would break the library ABI, don't enable unless really needed).
> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> Number of SNES iterations = 2
> ***Error detected during compile or link!***
> See https://petsc.org/release/faq/
> /home/zhangsijie1995/Documents/Package/petsc-3.18.3/src/snes/tutorials ex5f
> *
> /opt/intel/oneapi/mpi/2021.8.0/bin/mpif90 -fPIC -Wall -ffree-line-length-none 
> -ffree-line-length-0 -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 
>   -I/home/zhangsijie1995/Documents/Package/petsc-3.18.3/include 
> -I/home/zhangsijie1995/Documents/Package/petsc-3.18.3/arch-linux-c-debug/include
>  -I/opt/intel/oneapi/mkl/2023.0.0/include 
> -I/opt/intel/oneapi/mpi/2021.8.0/include ex5f.F90  
> -Wl,-rpath,/home/zhangsijie1995/Documents/Package/petsc-3.18.3/arch-linux-c-debug/lib
>  -L/home/zhangsijie1995/Documents/Package/petsc-3.18.3/arch-linux-c-debug/lib 
> -Wl,-rpath,/opt/intel/oneapi/mkl/2023.0.0/lib/intel64 
> -L/opt/intel/oneapi/mkl/2023.0.0/lib/intel64 
> -Wl,-rpath,/home/zhangsijie1995/Documents/Package/petsc-3.18.3/I_MPI_SUBSTITUTE_INSTALLDIR/lib/release
>  
> -L/home/zhangsijie1995/Documents/Package/petsc-3.18.3/I_MPI_SUBSTITUTE_INSTALLDIR/lib/release
>  
> -Wl,-rpath,/home/zhangsijie1995/Documents/Package/petsc-3.18.3/I_MPI_SUBSTITUTE_INSTALLDIR/lib
>  
> -L/home/zhangsijie1995/Documents/Package/petsc-3.18.3/I_MPI_SUBSTITUTE_INSTALLDIR/lib
>  -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/11 
> -L/usr/lib/gcc/x86_64-linux-gnu/11 -lpetsc -lmkl_intel_lp64 -lmkl_core 
> -lmkl_sequential -lpthread -lm -lstdc++ -ldl -lmpifort -lmpi -lrt -lpthread 
> -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -o ex5f
> f951: Warning: Nonexistent include directory 
> ‘I_MPI_SUBSTITUTE_INSTALLDIR/include/gfortran/11.1.0’ [-Wmissing-include-dirs]
> f951: Warning: Nonexistent include directory 
> ‘I_MPI_SUBSTITUTE_INSTALLDIR/include’ [-Wmissing-include-dirs]
> Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI 
> process
> See https://petsc.org/release/faq/
> hwloc/linux: Ignoring PCI device with non-16bit domain.
> Pass --enable-32bits-pci-domain to configure to support such devices
> (warning: it would break the library ABI, don't enable unless really needed).
> Number of SNES iterations = 3
> Completed test examples
> Error while running make check
> gmake[1]: *** [makefile:149: check] Error 1
> make: