Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Dave May
On Tue 14. Feb 2023 at 21:27, Jed Brown  wrote:

> Dave May  writes:
>
> > On Tue 14. Feb 2023 at 17:17, Jed Brown  wrote:
> >
> >> Can you share a reproducer? I think I recall the format requiring
> certain
> >> things to be Int32.
> >
> >
> > By default, the byte offset used with the appended data format is
> UInt32. I
> > believe that’s where the sizeof(int) is coming from. This default is
> > annoying as it limits the total size of your appended data to be < 3 GB.
> > That said, in the opening of the paraview file you can add this attribute
> >
> > header_type="UInt64"
>
> You mean in the header of the .vtu?


Yeah, within the opening VTKFile tag.
Like this:
<VTKFile type="xxx" byte_order="LittleEndian" header_type="UInt64">
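
For concreteness, here is a small standalone C sketch (not taken from plexvtu.c; the
array, names, and file name are made up, and the grid description is elided) of how
the appended section is laid out once header_type="UInt64" is set: each DataArray's
offset counts bytes from the '_' that opens <AppendedData>, and every raw block is
preceded by an 8-byte length prefix instead of the default 4-byte (UInt32) one.

/* Sketch: appended-data layout with 64-bit block headers. Illustrative only. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
  double   vals[3]  = {1.0, 2.0, 3.0};
  uint64_t blockLen = sizeof(vals);   /* 8-byte header because header_type="UInt64" */
  FILE    *fp       = fopen("demo.vtu", "wb");

  if (!fp) return 1;
  fprintf(fp, "<VTKFile type=\"UnstructuredGrid\" version=\"0.1\" "
              "byte_order=\"LittleEndian\" header_type=\"UInt64\">\n");
  /* ... <UnstructuredGrid>, <Piece>, and DataArray declarations such as
     <DataArray type="Float64" Name="demo" format="appended" offset="0"/> ... */
  fprintf(fp, "<AppendedData encoding=\"raw\">\n_");
  fwrite(&blockLen, sizeof(blockLen), 1, fp); /* length prefix for the block below */
  fwrite(vals, sizeof(double), 3, fp);        /* raw payload */
  fprintf(fp, "\n</AppendedData>\n</VTKFile>\n");
  fclose(fp);
  return 0;
}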

> Do you happen to have an example or pointers to docs describing this
> feature?


Example yes - will send it to you tomorrow. Docs… not really. Only stuff
like this

https://kitware.github.io/paraview-docs/latest/python/paraview.simple.XMLPStructuredGridWriter.html


https://kitware.github.io/paraview-docs/v5.8.0/python/paraview.simple.XMLMultiBlockDataWriter.html

All the writers seem to support it.


> Can we always do this?


Yep!


> It isn't mentioned in these:
>
> https://vtk.org/wp-content/uploads/2015/04/file-formats.pdf   (PDF was
> created in 2003)
>
> https://kitware.github.io/vtk-examples/site/VTKFileFormats/#xml-file-formats
>

Yes, I know. I’ve tied myself in knots for years because of the
assumption that the offset had to be an int.

Credit for the discovery goes to Carsten Uphoff. He showed me this.

Cheers,
Dave



> > then the size of the offset is now UInt64 and now large files can be
> > finally written.
> >
> >
> > Cheers,
> > Dave
> >
> >
> >
> >
> >>
> >> Mike Michell  writes:
> >>
> >> > Thanks for the note.
> >> > I understood that PETSc calculates the offsets for me through
> "boffset"
> >> > variable in plexvtu.c file. Please correct me if it is wrong.
> >> >
> >> > If plexvtu.c has a bug, it could be around "write file header" part in
> >> > which the boffset is also computed. Is this correct? I am not using
> >> complex
> >> > number.
> >> > There are several mixed parts among "Int32, UInt8, PetscInt_FMT,
> >> > PetscInt64_FMT" in writing the header.
> >> >
> >> > Which combination of those flags is correct for 64bit indices? I am
> gonna
> >> > modify plexvtu.c file with "#if defined(PETSC_USE_64BIT_INDICES)"
> >> > statement, but I do not know what is the correct form of the header
> flag
> >> > for 64bit indices.
> >> >
> >> > It is also confusing to me:
> >> > boffset += gpiece[r].ncells * sizeof(PetscInt) + sizeof(int);
> >> > How is sizeof(PetscInt) different from sizeof(int)?
> >> >
> >> > Thanks,
> >> > Mike
> >> >
> >> >
> >> >> On Tue, Feb 14, 2023 at 11:45 AM Mike Michell  >
> >> >> wrote:
> >> >>
> >> >>> I was trying to modify the header flags from "Int32" to "Int64", but
> >> the
> >> >>> problem was not resolved. Could I get any additional comments?
> >> >>>
> >> >>
> >> >> The calculated offsets are not correct I think.
> >> >>
> >> >>   Matt
> >> >>
> >> >>
> >> >>> Thanks,
> >> >>> Mike
> >> >>>
> >> >>>
> >>  Thanks for the comments.
> >>  To be precise on the question, the entire part of the header of the
> >> .vtu
> >>  file is attached:
> >> 
> >>  
> >>   >> byte_order="LittleEndian">
> >>    
> >>  
> >>    
> >>   >> NumberOfComponents="3"
> >>  format="appended" offset="0" />
> >>    
> >>    
> >>   >>  NumberOfComponents="1" format="appended" offset="116932" />
> >>   >>   NumberOfComponents="1" format="appended" offset="372936" />
> >>   >>   NumberOfComponents="1" format="appended" offset="404940" />
> >>    
> >>    
> >>   >>  format="appended" offset="408944" />
> >>    
> >>    
> >>   >>  NumberOfComponents="1" format="appended" offset="424948" />
> >>    
> >>  
> >>  
> >>    
> >>   >> NumberOfComponents="3"
> >>  format="appended" offset="463928" />
> >>    
> >>    
> >>   >>  NumberOfComponents="1" format="appended" offset="580860" />
> >>   >>   NumberOfComponents="1" format="appended" offset="836864" />
> >>   >>   NumberOfComponents="1" format="appended" offset="868868" />
> >>    
> >>    
> >>   >>  format="appended" offset="872872" />
> >>    
> >>    
> >>   >>  NumberOfComponents="1" format="appended" offset="76" />
> >>    
> >>  
> >>    
> >>    
> >> 
> >> 
> >>  Thanks,
> >>  Mike
> >> 
> >> 
> >> > On Sun, Feb 12, 2023 at 6:15 PM Mike Michell <
> mi.mike1...@gmail.com>
> >> > wrote:
> >> >
> >> >> Dear PETSc team,
> >> >>
> >> >> I am a user of PETSc with Fortran. My 

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Jed Brown
Dave May  writes:

> On Tue 14. Feb 2023 at 17:17, Jed Brown  wrote:
>
>> Can you share a reproducer? I think I recall the format requiring certain
>> things to be Int32.
>
>
> By default, the byte offset used with the appended data format is UInt32. I
> believe that’s where the sizeof(int) is coming from. This default is
> annoying as it limits the total size of your appended data to be < 3 GB.
> That said, in the opening of the paraview file you can add this attribute
>
> header_type="UInt64"

You mean in the header of the .vtu? Do you happen to have an example or 
pointers to docs describing this feature? Can we always do this? It isn't 
mentioned in these: 

https://vtk.org/wp-content/uploads/2015/04/file-formats.pdf   (PDF was created 
in 2003)
https://kitware.github.io/vtk-examples/site/VTKFileFormats/#xml-file-formats

> then the size of the offset is now UInt64 and now large files can be
> finally written.
>
>
> Cheers,
> Dave
>
>
>
>
>>
>> Mike Michell  writes:
>>
>> > Thanks for the note.
>> > I understood that PETSc calculates the offsets for me through "boffset"
>> > variable in plexvtu.c file. Please correct me if it is wrong.
>> >
>> > If plexvtu.c has a bug, it could be around "write file header" part in
>> > which the boffset is also computed. Is this correct? I am not using
>> complex
>> > number.
>> > There are several mixed parts among "Int32, UInt8, PetscInt_FMT,
>> > PetscInt64_FMT" in writing the header.
>> >
>> > Which combination of those flags is correct for 64bit indices? I am gonna
>> > modify plexvtu.c file with "#if defined(PETSC_USE_64BIT_INDICES)"
>> > statement, but I do not know what is the correct form of the header flag
>> > for 64bit indices.
>> >
>> > It is also confusing to me:
>> > boffset += gpiece[r].ncells * sizeof(PetscInt) + sizeof(int);
>> > How is sizeof(PetscInt) different from sizeof(int)?
>> >
>> > Thanks,
>> > Mike
>> >
>> >
>> >> On Tue, Feb 14, 2023 at 11:45 AM Mike Michell 
>> >> wrote:
>> >>
>> >>> I was trying to modify the header flags from "Int32" to "Int64", but
>> the
>> >>> problem was not resolved. Could I get any additional comments?
>> >>>
>> >>
>> >> The calculated offsets are not correct I think.
>> >>
>> >>   Matt
>> >>
>> >>
>> >>> Thanks,
>> >>> Mike
>> >>>
>> >>>
>>  Thanks for the comments.
>>  To be precise on the question, the entire part of the header of the
>> .vtu
>>  file is attached:
>> 
>>  
>>  > byte_order="LittleEndian">
>>    
>>  
>>    
>>  > NumberOfComponents="3"
>>  format="appended" offset="0" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="116932" />
>>  >   NumberOfComponents="1" format="appended" offset="372936" />
>>  >   NumberOfComponents="1" format="appended" offset="404940" />
>>    
>>    
>>  >  format="appended" offset="408944" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="424948" />
>>    
>>  
>>  
>>    
>>  > NumberOfComponents="3"
>>  format="appended" offset="463928" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="580860" />
>>  >   NumberOfComponents="1" format="appended" offset="836864" />
>>  >   NumberOfComponents="1" format="appended" offset="868868" />
>>    
>>    
>>  >  format="appended" offset="872872" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="76" />
>>    
>>  
>>    
>>    
>> 
>> 
>>  Thanks,
>>  Mike
>> 
>> 
>> > On Sun, Feb 12, 2023 at 6:15 PM Mike Michell 
>> > wrote:
>> >
>> >> Dear PETSc team,
>> >>
>> >> I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
>> >> object. To print out output variable and mesh connectivity, I use
>> VecView()
>> >> by defining PetscSection on that dm and borrow a vector. The type
>> of the
>> >> viewer is set to PETSCVIEWERVTK.
>> >>
>> >> With 32bit indices, the above work flow has no issue. However, if
>> >> PETSc is configured with 64bit indices, my output .vtu file has an
>> error if
>> >> I open the file with visualization tools, such as Paraview or
>> Tecplot,
>> >> saying that:
>> >> "Cannot read cell connectivity from Cells in piece 0 because the
>> >> "offsets" array is not monotonically increasing or starts with a
>> value
>> >> other than 0."
>> >>
>> >> If I open the .vtu file from terminal, I can see such a line:
>> >> ...
>> >> > >> format="appended" offset="580860" />
>> >> ...
>> >>
>> >> I expected "DataArray type="Int64", since the PETSc has 64bit
>> indices.
>> >> Could I get recommendations that I need 

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Dave May
On Tue 14. Feb 2023 at 21:03, Dave May  wrote:

>
>
> On Tue 14. Feb 2023 at 17:17, Jed Brown  wrote:
>
>> Can you share a reproducer? I think I recall the format requiring certain
>> things to be Int32.
>
>
> By default, the byte offset used with the appended data format is UInt32.
> I believe that’s where the sizeof(int) is coming from. This default is
> annoying as it limits the total size of your appended data to be < 3 GB.
>

Oops, I meant to type 4 GB

> That said, in the opening of the paraview file you can add this attribute
>
> header_type="UInt64"
>
> then the size of the offset is now UInt64 and now large files can be
> finally written.
>
>
> Cheers,
> Dave
>
>
>
>
>>
>> Mike Michell  writes:
>>
>> > Thanks for the note.
>> > I understood that PETSc calculates the offsets for me through "boffset"
>> > variable in plexvtu.c file. Please correct me if it is wrong.
>> >
>> > If plexvtu.c has a bug, it could be around "write file header" part in
>> > which the boffset is also computed. Is this correct? I am not using
>> complex
>> > number.
>> > There are several mixed parts among "Int32, UInt8, PetscInt_FMT,
>> > PetscInt64_FMT" in writing the header.
>> >
>> > Which combination of those flags is correct for 64bit indices? I am
>> gonna
>> > modify plexvtu.c file with "#if defined(PETSC_USE_64BIT_INDICES)"
>> > statement, but I do not know what is the correct form of the header flag
>> > for 64bit indices.
>> >
>> > It is also confusing to me:
>> > boffset += gpiece[r].ncells * sizeof(PetscInt) + sizeof(int);
>> > How is sizeof(PetscInt) different from sizeof(int)?
>> >
>> > Thanks,
>> > Mike
>> >
>> >
>> >> On Tue, Feb 14, 2023 at 11:45 AM Mike Michell 
>> >> wrote:
>> >>
>> >>> I was trying to modify the header flags from "Int32" to "Int64", but
>> the
>> >>> problem was not resolved. Could I get any additional comments?
>> >>>
>> >>
>> >> The calculated offsets are not correct I think.
>> >>
>> >>   Matt
>> >>
>> >>
>> >>> Thanks,
>> >>> Mike
>> >>>
>> >>>
>>  Thanks for the comments.
>>  To be precise on the question, the entire part of the header of the
>> .vtu
>>  file is attached:
>> 
>>  
>>  > byte_order="LittleEndian">
>>    
>>  
>>    
>>  > NumberOfComponents="3"
>>  format="appended" offset="0" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="116932" />
>>  >   NumberOfComponents="1" format="appended" offset="372936" />
>>  >   NumberOfComponents="1" format="appended" offset="404940" />
>>    
>>    
>>  >  format="appended" offset="408944" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="424948" />
>>    
>>  
>>  
>>    
>>  > NumberOfComponents="3"
>>  format="appended" offset="463928" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="580860" />
>>  >   NumberOfComponents="1" format="appended" offset="836864" />
>>  >   NumberOfComponents="1" format="appended" offset="868868" />
>>    
>>    
>>  >  format="appended" offset="872872" />
>>    
>>    
>>  >  NumberOfComponents="1" format="appended" offset="76" />
>>    
>>  
>>    
>>    
>> 
>> 
>>  Thanks,
>>  Mike
>> 
>> 
>> > On Sun, Feb 12, 2023 at 6:15 PM Mike Michell > >
>> > wrote:
>> >
>> >> Dear PETSc team,
>> >>
>> >> I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
>> >> object. To print out output variable and mesh connectivity, I use
>> VecView()
>> >> by defining PetscSection on that dm and borrow a vector. The type
>> of the
>> >> viewer is set to PETSCVIEWERVTK.
>> >>
>> >> With 32bit indices, the above work flow has no issue. However, if
>> >> PETSc is configured with 64bit indices, my output .vtu file has an
>> error if
>> >> I open the file with visualization tools, such as Paraview or
>> Tecplot,
>> >> saying that:
>> >> "Cannot read cell connectivity from Cells in piece 0 because the
>> >> "offsets" array is not monotonically increasing or starts with a
>> value
>> >> other than 0."
>> >>
>> >> If I open the .vtu file from terminal, I can see such a line:
>> >> ...
>> >> > >> format="appended" offset="580860" />
>> >> ...
>> >>
>> >> I expected "DataArray type="Int64", since the PETSc has 64bit
>> indices.
>> >> Could I get recommendations that I need to check to resolve the
>> issue?
>> >>
>> >
>> > This is probably a bug. We will look at it.
>> >
>> > Jed, I saw that Int32 is hardcoded in plexvtu.c, but
>> sizeof(PetscInt)
>> > is used to calculate the offset, which looks inconsistent. Can 

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Dave May
On Tue 14. Feb 2023 at 17:17, Jed Brown  wrote:

> Can you share a reproducer? I think I recall the format requiring certain
> things to be Int32.


By default, the byte offset used with the appended data format is UInt32. I
believe that’s where the sizeof(int) is coming from. This default is
annoying as it limits the total size of your appended data to be < 3 GB.
That said, in the opening of the paraview file you can add this attribute

header_type="UInt64"

then the offsets are UInt64 and large files can finally be written.


Cheers,
Dave




>
> Mike Michell  writes:
>
> > Thanks for the note.
> > I understood that PETSc calculates the offsets for me through "boffset"
> > variable in plexvtu.c file. Please correct me if it is wrong.
> >
> > If plexvtu.c has a bug, it could be around "write file header" part in
> > which the boffset is also computed. Is this correct? I am not using
> complex
> > number.
> > There are several mixed parts among "Int32, UInt8, PetscInt_FMT,
> > PetscInt64_FMT" in writing the header.
> >
> > Which combination of those flags is correct for 64bit indices? I am gonna
> > modify plexvtu.c file with "#if defined(PETSC_USE_64BIT_INDICES)"
> > statement, but I do not know what is the correct form of the header flag
> > for 64bit indices.
> >
> > It is also confusing to me:
> > boffset += gpiece[r].ncells * sizeof(PetscInt) + sizeof(int);
> > How is sizeof(PetscInt) different from sizeof(int)?
> >
> > Thanks,
> > Mike
> >
> >
> >> On Tue, Feb 14, 2023 at 11:45 AM Mike Michell 
> >> wrote:
> >>
> >>> I was trying to modify the header flags from "Int32" to "Int64", but
> the
> >>> problem was not resolved. Could I get any additional comments?
> >>>
> >>
> >> The calculated offsets are not correct I think.
> >>
> >>   Matt
> >>
> >>
> >>> Thanks,
> >>> Mike
> >>>
> >>>
>  Thanks for the comments.
>  To be precise on the question, the entire part of the header of the
> .vtu
>  file is attached:
> 
>  
>   byte_order="LittleEndian">
>    
>  
>    
>   NumberOfComponents="3"
>  format="appended" offset="0" />
>    
>    
>    NumberOfComponents="1" format="appended" offset="116932" />
>     NumberOfComponents="1" format="appended" offset="372936" />
>     NumberOfComponents="1" format="appended" offset="404940" />
>    
>    
>    format="appended" offset="408944" />
>    
>    
>    NumberOfComponents="1" format="appended" offset="424948" />
>    
>  
>  
>    
>   NumberOfComponents="3"
>  format="appended" offset="463928" />
>    
>    
>    NumberOfComponents="1" format="appended" offset="580860" />
>     NumberOfComponents="1" format="appended" offset="836864" />
>     NumberOfComponents="1" format="appended" offset="868868" />
>    
>    
>    format="appended" offset="872872" />
>    
>    
>    NumberOfComponents="1" format="appended" offset="76" />
>    
>  
>    
>    
> 
> 
>  Thanks,
>  Mike
> 
> 
> > On Sun, Feb 12, 2023 at 6:15 PM Mike Michell 
> > wrote:
> >
> >> Dear PETSc team,
> >>
> >> I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
> >> object. To print out output variable and mesh connectivity, I use
> VecView()
> >> by defining PetscSection on that dm and borrow a vector. The type
> of the
> >> viewer is set to PETSCVIEWERVTK.
> >>
> >> With 32bit indices, the above work flow has no issue. However, if
> >> PETSc is configured with 64bit indices, my output .vtu file has an
> error if
> >> I open the file with visualization tools, such as Paraview or
> Tecplot,
> >> saying that:
> >> "Cannot read cell connectivity from Cells in piece 0 because the
> >> "offsets" array is not monotonically increasing or starts with a
> value
> >> other than 0."
> >>
> >> If I open the .vtu file from terminal, I can see such a line:
> >> ...
> >>  >> format="appended" offset="580860" />
> >> ...
> >>
> >> I expected "DataArray type="Int64", since the PETSc has 64bit
> indices.
> >> Could I get recommendations that I need to check to resolve the
> issue?
> >>
> >
> > This is probably a bug. We will look at it.
> >
> > Jed, I saw that Int32 is hardcoded in plexvtu.c, but sizeof(PetscInt)
> > is used to calculate the offset, which looks inconsistent. Can you
> take a
> > look?
> >
> >   Thanks,
> >
> >  Matt
> >
> >
> >> Thanks,
> >> Mike
> >>
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely 

Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Jed Brown
Can you share a reproducer? I think I recall the format requiring certain 
things to be Int32.

Mike Michell  writes:

> Thanks for the note.
> I understood that PETSc calculates the offsets for me through "boffset"
> variable in plexvtu.c file. Please correct me if it is wrong.
>
> If plexvtu.c has a bug, it could be around "write file header" part in
> which the boffset is also computed. Is this correct? I am not using complex
> number.
> There are several mixed parts among "Int32, UInt8, PetscInt_FMT,
> PetscInt64_FMT" in writing the header.
>
> Which combination of those flags is correct for 64bit indices? I am gonna
> modify plexvtu.c file with "#if defined(PETSC_USE_64BIT_INDICES)"
> statement, but I do not know what is the correct form of the header flag
> for 64bit indices.
>
> It is also confusing to me:
> boffset += gpiece[r].ncells * sizeof(PetscInt) + sizeof(int);
> How is sizeof(PetscInt) different from sizeof(int)?
>
> Thanks,
> Mike
>
>
>> On Tue, Feb 14, 2023 at 11:45 AM Mike Michell 
>> wrote:
>>
>>> I was trying to modify the header flags from "Int32" to "Int64", but the
>>> problem was not resolved. Could I get any additional comments?
>>>
>>
>> The calculated offsets are not correct I think.
>>
>>   Matt
>>
>>
>>> Thanks,
>>> Mike
>>>
>>>
 Thanks for the comments.
 To be precise on the question, the entire part of the header of the .vtu
 file is attached:

 
 
   
 
   
 >>> format="appended" offset="0" />
   
   
 >>> NumberOfComponents="1" format="appended" offset="116932" />
 >>>  NumberOfComponents="1" format="appended" offset="372936" />
 >>>  NumberOfComponents="1" format="appended" offset="404940" />
   
   
 >>> format="appended" offset="408944" />
   
   
 >>> NumberOfComponents="1" format="appended" offset="424948" />
   
 
 
   
 >>> format="appended" offset="463928" />
   
   
 >>> NumberOfComponents="1" format="appended" offset="580860" />
 >>>  NumberOfComponents="1" format="appended" offset="836864" />
 >>>  NumberOfComponents="1" format="appended" offset="868868" />
   
   
 >>> format="appended" offset="872872" />
   
   
 >>> NumberOfComponents="1" format="appended" offset="76" />
   
 
   
   


 Thanks,
 Mike


> On Sun, Feb 12, 2023 at 6:15 PM Mike Michell 
> wrote:
>
>> Dear PETSc team,
>>
>> I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
>> object. To print out output variable and mesh connectivity, I use 
>> VecView()
>> by defining PetscSection on that dm and borrow a vector. The type of the
>> viewer is set to PETSCVIEWERVTK.
>>
>> With 32bit indices, the above work flow has no issue. However, if
>> PETSc is configured with 64bit indices, my output .vtu file has an error 
>> if
>> I open the file with visualization tools, such as Paraview or Tecplot,
>> saying that:
>> "Cannot read cell connectivity from Cells in piece 0 because the
>> "offsets" array is not monotonically increasing or starts with a value
>> other than 0."
>>
>> If I open the .vtu file from terminal, I can see such a line:
>> ...
>> > format="appended" offset="580860" />
>> ...
>>
>> I expected "DataArray type="Int64", since the PETSc has 64bit indices.
>> Could I get recommendations that I need to check to resolve the issue?
>>
>
> This is probably a bug. We will look at it.
>
> Jed, I saw that Int32 is hardcoded in plexvtu.c, but sizeof(PetscInt)
> is used to calculate the offset, which looks inconsistent. Can you take a
> look?
>
>   Thanks,
>
>  Matt
>
>
>> Thanks,
>> Mike
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>

>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>


Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Mike Michell
Thanks for the note.
I understood that PETSc calculates the offsets for me through the "boffset"
variable in the plexvtu.c file. Please correct me if that is wrong.

If plexvtu.c has a bug, it could be around the "write file header" part, in
which boffset is also computed. Is this correct? I am not using complex
numbers.
Several types are mixed when writing the header: "Int32", "UInt8",
PetscInt_FMT, and PetscInt64_FMT.

Which combination of those flags is correct for 64-bit indices? I am going
to modify plexvtu.c with an "#if defined(PETSC_USE_64BIT_INDICES)"
statement, but I do not know the correct form of the header flag
for 64-bit indices.

It is also confusing to me:
boffset += gpiece[r].ncells * sizeof(PetscInt) + sizeof(int);
How is sizeof(PetscInt) different from sizeof(int)?

Thanks,
Mike
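
To make the size question concrete, here is a tiny C program (assuming a PETSc build
configured with --with-64-bit-indices and its headers on the include path). The extra
sizeof(int) in the boffset update appears to correspond to the 4-byte length header
that precedes each appended block when the default UInt32 header is used, as Dave
explains elsewhere in the thread.

#include <stdio.h>
#include <petscsys.h> /* defines PetscInt */

int main(void)
{
  /* With --with-64-bit-indices, PetscInt is 8 bytes; int is 4 on common platforms.
     In plexvtu.c, boffset advances by ncells*sizeof(PetscInt) for the payload plus
     sizeof(int) for the (UInt32) length header written before each appended block. */
  printf("sizeof(PetscInt) = %d, sizeof(int) = %d\n", (int)sizeof(PetscInt), (int)sizeof(int));
  return 0;
}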


> On Tue, Feb 14, 2023 at 11:45 AM Mike Michell 
> wrote:
>
>> I was trying to modify the header flags from "Int32" to "Int64", but the
>> problem was not resolved. Could I get any additional comments?
>>
>
> The calculated offsets are not correct I think.
>
>   Matt
>
>
>> Thanks,
>> Mike
>>
>>
>>> Thanks for the comments.
>>> To be precise on the question, the entire part of the header of the .vtu
>>> file is attached:
>>>
>>> 
>>> 
>>>   
>>> 
>>>   
>>> >> format="appended" offset="0" />
>>>   
>>>   
>>> >> NumberOfComponents="1" format="appended" offset="116932" />
>>> >>  NumberOfComponents="1" format="appended" offset="372936" />
>>> >>  NumberOfComponents="1" format="appended" offset="404940" />
>>>   
>>>   
>>> >> format="appended" offset="408944" />
>>>   
>>>   
>>> >> NumberOfComponents="1" format="appended" offset="424948" />
>>>   
>>> 
>>> 
>>>   
>>> >> format="appended" offset="463928" />
>>>   
>>>   
>>> >> NumberOfComponents="1" format="appended" offset="580860" />
>>> >>  NumberOfComponents="1" format="appended" offset="836864" />
>>> >>  NumberOfComponents="1" format="appended" offset="868868" />
>>>   
>>>   
>>> >> format="appended" offset="872872" />
>>>   
>>>   
>>> >> NumberOfComponents="1" format="appended" offset="76" />
>>>   
>>> 
>>>   
>>>   
>>>
>>>
>>> Thanks,
>>> Mike
>>>
>>>
 On Sun, Feb 12, 2023 at 6:15 PM Mike Michell 
 wrote:

> Dear PETSc team,
>
> I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
> object. To print out output variable and mesh connectivity, I use 
> VecView()
> by defining PetscSection on that dm and borrow a vector. The type of the
> viewer is set to PETSCVIEWERVTK.
>
> With 32bit indices, the above work flow has no issue. However, if
> PETSc is configured with 64bit indices, my output .vtu file has an error 
> if
> I open the file with visualization tools, such as Paraview or Tecplot,
> saying that:
> "Cannot read cell connectivity from Cells in piece 0 because the
> "offsets" array is not monotonically increasing or starts with a value
> other than 0."
>
> If I open the .vtu file from terminal, I can see such a line:
> ...
>  format="appended" offset="580860" />
> ...
>
> I expected "DataArray type="Int64", since the PETSc has 64bit indices.
> Could I get recommendations that I need to check to resolve the issue?
>

 This is probably a bug. We will look at it.

 Jed, I saw that Int32 is hardcoded in plexvtu.c, but sizeof(PetscInt)
 is used to calculate the offset, which looks inconsistent. Can you take a
 look?

   Thanks,

  Matt


> Thanks,
> Mike
>


 --
 What most experimenters take for granted before they begin their
 experiments is infinitely more interesting than any results to which their
 experiments lead.
 -- Norbert Wiener

 https://www.cse.buffalo.edu/~knepley/
 

>>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Matthew Knepley
On Tue, Feb 14, 2023 at 11:45 AM Mike Michell  wrote:

> I was trying to modify the header flags from "Int32" to "Int64", but the
> problem was not resolved. Could I get any additional comments?
>

The calculated offsets are not correct I think.

  Matt


> Thanks,
> Mike
>
>
>> Thanks for the comments.
>> To be precise on the question, the entire part of the header of the .vtu
>> file is attached:
>>
>> 
>> 
>>   
>> 
>>   
>> > format="appended" offset="0" />
>>   
>>   
>> > NumberOfComponents="1" format="appended" offset="116932" />
>> >  NumberOfComponents="1" format="appended" offset="372936" />
>> >  NumberOfComponents="1" format="appended" offset="404940" />
>>   
>>   
>> > format="appended" offset="408944" />
>>   
>>   
>> > NumberOfComponents="1" format="appended" offset="424948" />
>>   
>> 
>> 
>>   
>> > format="appended" offset="463928" />
>>   
>>   
>> > NumberOfComponents="1" format="appended" offset="580860" />
>> >  NumberOfComponents="1" format="appended" offset="836864" />
>> >  NumberOfComponents="1" format="appended" offset="868868" />
>>   
>>   
>> > format="appended" offset="872872" />
>>   
>>   
>> > NumberOfComponents="1" format="appended" offset="76" />
>>   
>> 
>>   
>>   
>>
>>
>> Thanks,
>> Mike
>>
>>
>>> On Sun, Feb 12, 2023 at 6:15 PM Mike Michell 
>>> wrote:
>>>
 Dear PETSc team,

 I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
 object. To print out output variable and mesh connectivity, I use VecView()
 by defining PetscSection on that dm and borrow a vector. The type of the
 viewer is set to PETSCVIEWERVTK.

 With 32bit indices, the above work flow has no issue. However, if PETSc
 is configured with 64bit indices, my output .vtu file has an error if I
 open the file with visualization tools, such as Paraview or Tecplot, saying
 that:
 "Cannot read cell connectivity from Cells in piece 0 because the
 "offsets" array is not monotonically increasing or starts with a value
 other than 0."

 If I open the .vtu file from terminal, I can see such a line:
 ...
 >>> format="appended" offset="580860" />
 ...

 I expected "DataArray type="Int64", since the PETSc has 64bit indices.
 Could I get recommendations that I need to check to resolve the issue?

>>>
>>> This is probably a bug. We will look at it.
>>>
>>> Jed, I saw that Int32 is hardcoded in plexvtu.c, but sizeof(PetscInt) is
>>> used to calculate the offset, which looks inconsistent. Can you take a look?
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>>
 Thanks,
 Mike

>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> 
>>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] MatFDColoringSetUp with Periodic BC

2023-02-14 Thread Barry Smith

  I have created an MR that documents this and moved the error checking to a 
more appropriate location: https://gitlab.com/petsc/petsc/-/merge_requests/6070


> On Feb 14, 2023, at 4:33 AM, Pierre Bernigaud  
> wrote:
> 
> Dear all, 
> 
> I hope this email finds you well. 
> We are currently working on a solver which is employing DMDA with SNES. The 
> jacobian is computed via FDColoring, ie: 
> 
> call DMDACreate1D(PETSC_COMM_WORLD, DM_BOUNDARY_GHOSTED, NY, NC, NGc, 
> PETSC_NULL_INTEGER, dmF, ierr)
> 
> ! - Some steps ... 
> 
> call DMCreateColoring(dmF, IS_COLORING_GLOBAL, iscoloring, ierr)
> call MatFDColoringCreate(Jac,iscoloring, matfdcoloring, ierr)
> call MatFDColoringSetFunction(matfdcoloring, FormFunction, CTX, ierr)
> call MatFDColoringSetUp(Jac ,iscoloring,matfdcoloring, ierr)
> call SNESSetJacobian(snes, Jac, Jac, SNESComputeJacobianDefaultColor, 
> matfdcoloring, ierr)
> 
> Everything is running smoothly. 
> Recently, we modified the boundary conditions such as to use periodic BC: 
> 
> call DMDACreate1D(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, NY, NC, NGc, 
> PETSC_NULL_INTEGER, dmF, ierr)
> 
> We then encountered frequent crashes when calling MatFDColoringSetUp, 
> depending on the number of cells NY. After looking for an solution, I found 
> this old thread: 
> https://lists.mcs.anl.gov/pipermail/petsc-users/2013-May/017449.html 
> It appears that when using periodic BC, FDColoring can only be used if the 
> number of cells is divisible by 2*NGc+1. Even though this is only a slight 
> annoyance, I was wondering if you were working on this matter / if you had a 
> quick fix at hand? At any rate, I think it would be nice if a warning was 
> displayed in the FDColoring documentation?  
> 
> Respectfully, 
> Pierre Bernigaud 
> 



Re: [petsc-users] Help with fieldsplit performance

2023-02-14 Thread Edoardo alinovi
Hi Matt,

I have done some research over the past few days and found that I might
try to assemble the SIMPLE approximation of the Schur complement
(myS = A11 - A10 inv(DIAGFORM(A00)) A01).

While reading papers, I came across a question, which I believe is a very
silly one but worth asking...

Is the way the unknowns are packed in the matrix relevant for Schur
preconditioning?

I was studying ex70.c a bit; there the block matrix is defined like:

A = [A00  A01
     A10  A11]

where A00 is the momentum equation matrix, A11 is the pressure equation
matrix, and A01 and A10 are the matrices for the coupling terms (i.e.,
pressure gradient and continuity). The unknowns are x = [u1...uN v1...vN
w1...wN p1...pN]^T.

In my case, I assemble the matrix cell by cell (FV method), and the result
will be this one:

[image: image.png]

Then I split the fields, giving indices 0-1 for u and 2 for p. I guess PETSc
is already doing the right thing, picking up the a33 entries to assemble
A11, but it is worth being 100% sure :)

Thank you!
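
As a hedged sketch in C (the solver discussed here is Fortran/FV, so the names and
surrounding setup are illustrative), an interlaced u-v-p ordering with block size 3
could be split as below, with the Schur complement preconditioned by
A11 - A10 inv(diag(A00)) A01 via the "selfp" option:

#include <petscksp.h>

static PetscErrorCode SetupFieldSplit(KSP ksp)
{
  PC       pc;
  PetscInt ufields[] = {0, 1}, pfields[] = {2};

  PetscFunctionBeginUser;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  PetscCall(PCFieldSplitSetBlockSize(pc, 3));      /* interlaced [u1 v1 p1 u2 v2 p2 ...] */
  PetscCall(PCFieldSplitSetFields(pc, "u", 2, ufields, ufields));
  PetscCall(PCFieldSplitSetFields(pc, "p", 1, pfields, pfields));
  PetscCall(PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR));
  PetscCall(PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_SELFP, NULL));
  PetscFunctionReturn(0);
}

The same choice should be selectable from the command line with
-pc_fieldsplit_block_size 3 -pc_fieldsplit_0_fields 0,1 -pc_fieldsplit_1_fields 2
-pc_fieldsplit_type schur -pc_fieldsplit_schur_precondition selfp.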


Re: [petsc-users] PetscViewer with 64bit

2023-02-14 Thread Mike Michell
I was trying to modify the header flags from "Int32" to "Int64", but the
problem was not resolved. Could I get any additional comments?

Thanks,
Mike


> Thanks for the comments.
> To be precise on the question, the entire part of the header of the .vtu
> file is attached:
>
> 
> 
>   
> 
>   
>  format="appended" offset="0" />
>   
>   
>  format="appended" offset="116932" />
>  format="appended" offset="372936" />
>  format="appended" offset="404940" />
>   
>   
>  format="appended" offset="408944" />
>   
>   
>  NumberOfComponents="1" format="appended" offset="424948" />
>   
> 
> 
>   
>  format="appended" offset="463928" />
>   
>   
>  format="appended" offset="580860" />
>  format="appended" offset="836864" />
>  format="appended" offset="868868" />
>   
>   
>  format="appended" offset="872872" />
>   
>   
>  NumberOfComponents="1" format="appended" offset="76" />
>   
> 
>   
>   
>
>
> Thanks,
> Mike
>
>
>> On Sun, Feb 12, 2023 at 6:15 PM Mike Michell 
>> wrote:
>>
>>> Dear PETSc team,
>>>
>>> I am a user of PETSc with Fortran. My code uses DMPlex to handle dm
>>> object. To print out output variable and mesh connectivity, I use VecView()
>>> by defining PetscSection on that dm and borrow a vector. The type of the
>>> viewer is set to PETSCVIEWERVTK.
>>>
>>> With 32bit indices, the above work flow has no issue. However, if PETSc
>>> is configured with 64bit indices, my output .vtu file has an error if I
>>> open the file with visualization tools, such as Paraview or Tecplot, saying
>>> that:
>>> "Cannot read cell connectivity from Cells in piece 0 because the
>>> "offsets" array is not monotonically increasing or starts with a value
>>> other than 0."
>>>
>>> If I open the .vtu file from terminal, I can see such a line:
>>> ...
>>> >> format="appended" offset="580860" />
>>> ...
>>>
>>> I expected "DataArray type="Int64", since the PETSc has 64bit indices.
>>> Could I get recommendations that I need to check to resolve the issue?
>>>
>>
>> This is probably a bug. We will look at it.
>>
>> Jed, I saw that Int32 is hardcoded in plexvtu.c, but sizeof(PetscInt) is
>> used to calculate the offset, which looks inconsistent. Can you take a look?
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> Thanks,
>>> Mike
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>
>


Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-14 Thread Matthew Knepley
On Tue, Feb 14, 2023 at 8:36 AM Nicolas Barnafi 
wrote:

> Hello Matt,
>
> After some discussions elsewhere (thanks @LMitchell!), we found out that
> the problem is that the fields are setup with PCSetIS, without an attached
> DM, which does not support this kind of nesting fields.
>
> I would like to add this feature, meaning that during the setup of the
> preconditioner there should be an additional routine when there is no dm
> that reads _%i_fields and sets the corresponding fields to the sub PCs, in
> some order.
>
> My plan would be to do so at the PCSetUp_FieldSplit level. The idea is
> that whenever some IS fields are set such as 'a' and 'b', it should be
> possible to rearrange them with '-pc_fieldsplit_0_fields a,b' , or at least
> support this with numbered fields.
>
> How do you see it?
>

Just to clarify, if you call SetIS() 3 times, and then give

  -pc_fieldsplit_0_fields 0,2

then we should reduce the number of fields to two by calling
ISConcatenate() on the first and last ISes?

I think this should not be hard. It will work exactly as it does in the DM
case, except the ISes will come from the PC, not the DM. One complication is
that you will have to hold the new ISes until the end, and then set them.

   Thanks,

 Matt
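
A rough sketch of the IS-merging step described above (a C illustration; the names
are made up, not taken from fieldsplit.c):

#include <petscis.h>

static PetscErrorCode MergeSplits(MPI_Comm comm, IS is_field0, IS is_field2, IS *is_split0)
{
  IS list[2];

  PetscFunctionBeginUser;
  list[0] = is_field0; /* the IS the user passed for field 0 */
  list[1] = is_field2; /* the IS the user passed for field 2 */
  PetscCall(ISConcatenate(comm, 2, list, is_split0));
  PetscCall(ISSort(*is_split0)); /* keep the merged split as an increasing index set */
  PetscFunctionReturn(0);
}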


> Best,
> NB
>
> On 06/02/23 17:57, Matthew Knepley wrote:
>
> On Mon, Feb 6, 2023 at 11:45 AM Nicolas Barnafi 
> wrote:
>
>> Thank you Matt,
>>
>> Again, at the bottom of this message you will find the -info output. I
>> don't see any output related to the fields,
>>
>
> If the splits were done automatically, you would see an info message from
> here:
>
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L1595
>
> Thus it must be setup here
>
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L380
>
> There are info statements, but you do not see them, I do not see a way
> around using a small example
> to understand how you are setting up the system, since it is working as
> expected in the PETSc examples.
>
>   Thanks,
>
>   Matt
>
>
>> Best
>>
>>
>> -- -info
>>
>> [0]  PetscDetermineInitialFPTrap(): Floating point trapping is on by
>> default 13
>> [0]  PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType
>> host available, initializing
>> [0]  PetscDeviceInitializeTypeFromOptions_Private(): PetscDevice
>> host initialized, default device id 0, view FALSE, init type lazy
>> [0]  PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType
>> cuda not available
>> [0]  PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType
>> hip not available
>> [0]  PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType
>> sycl not available
>> [0]  PetscInitialize_Common(): PETSc successfully started: number of
>> processors = 1
>> [0]  PetscGetHostName(): Rejecting domainname, likely is NIS
>> nico-santech.(none)
>> [0]  PetscInitialize_Common(): Running on machine: nico-santech
>> [0]  SlepcInitialize(): SLEPc successfully started
>> [0]  PetscCommDuplicate(): Duplicating a communicator 94770066936960
>> 94770087780768 max tags = 2147483647
>> [0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc
>> communicator embedded in a user MPI_Comm 94770087780768
>> [0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960
>> is being unlinked from inner PETSc comm 94770087780768
>> [0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>> [0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an
>> MPI_Comm 94770087780768
>> [0]  PetscCommDuplicate(): Duplicating a communicator 94770066936960
>> 94770087780768 max tags = 2147483647
>> [0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc
>> communicator embedded in a user MPI_Comm 94770087780768
>> [0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960
>> is being unlinked from inner PETSc comm 94770087780768
>> [0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>> [0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an
>> MPI_Comm 94770087780768
>> [0]  PetscCommDuplicate(): Duplicating a communicator 94770066936960
>> 94770087780768 max tags = 2147483647
>> [0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc
>> communicator embedded in a user MPI_Comm 94770087780768
>> [0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960
>> is being unlinked from inner PETSc comm 94770087780768
>> [0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
>> [0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an
>> MPI_Comm 94770087780768
>> [0]  PetscCommDuplicate(): Duplicating a communicator 94770066936960
>> 94770087780768 max tags = 2147483647
>> [0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc
>> communicator embedded in a user MPI_Comm 94770087780768
>> [0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960
>> is being unlinked from inner PETSc comm 94770087780768
>> [0]  

Re: [petsc-users] Segregated solvers in PETSc

2023-02-14 Thread Matthew Knepley
On Tue, Feb 14, 2023 at 9:25 AM Miguel Angel Salazar de Troya <
miguel.sala...@corintis.com> wrote:

> Hello,
>
> I am solving the Navier-Stokes equation and an advection-diffusion
> equation to model the temperature. They are fully coupled because the
> viscosity is temperature dependent. I plan to solve the fully-coupled
> problem with a segregated approach: I first solve the Navier-Stokes
> equation for a fixed temperature and feed the velocity to the thermal
> equation, then use that temperature back in the Navier-Stokes equation to
> solve for the velocity again until I reach convergence. If I assemble the
> residual and jacobian for the fully coupled system with the proper fields
> section for the fieldsplit preconditioner (I am using Firedrake), is there
> a way to tell PETSc to solve the problem with a segregated approach?
>

For a linear problem, this is easy using PCFIELDSPLIT. Thus you could use
this to solve the Newton system for your problem. Doing this for a
nonlinear problem is still harder because the top-level PETSc interface
does not have a way to assemble subsets of the nonlinear problem. If you
use a DMPlex to express the problem _and_ its callbacks to express the
physics, then we can split the nonlinear problem into pieces. I am working
with the Firedrake folks to develop a general interface for this. Hopefully
we finish this year.

Right now, I think the easiest way to do the nonlinear thing is to write
separate UFL for the two problems and control the loop yourself, or to just
use the linear variation.

  Thanks,

 Matt


> Thanks,
> Miguel
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 
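
The "control the loop yourself" route mentioned above could look roughly like the
following sketch (the two SNES objects, their callbacks, and the way u and T are
shared through application contexts are all assumed to be set up elsewhere; the
stopping test is illustrative):

#include <petscsnes.h>

static PetscErrorCode SolveSegregated(SNES snes_flow, SNES snes_heat, Vec u, Vec T,
                                      PetscReal rtol, PetscInt maxit)
{
  Vec       Told;
  PetscReal dnorm, tnorm;

  PetscFunctionBeginUser;
  PetscCall(VecDuplicate(T, &Told));
  for (PetscInt it = 0; it < maxit; ++it) {
    PetscCall(VecCopy(T, Told));
    PetscCall(SNESSolve(snes_flow, NULL, u)); /* velocity with the current temperature */
    PetscCall(SNESSolve(snes_heat, NULL, T)); /* temperature with the updated velocity */
    PetscCall(VecAXPY(Told, -1.0, T));        /* Told <- T_old - T_new */
    PetscCall(VecNorm(Told, NORM_2, &dnorm));
    PetscCall(VecNorm(T, NORM_2, &tnorm));
    if (dnorm <= rtol * (tnorm + 1.0)) break; /* relative change in T is small enough */
  }
  PetscCall(VecDestroy(&Told));
  PetscFunctionReturn(0);
}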


[petsc-users] Segregated solvers in PETSc

2023-02-14 Thread Miguel Angel Salazar de Troya
Hello,

I am solving the Navier-Stokes equation and an advection-diffusion equation
to model the temperature. They are fully coupled because the viscosity is
temperature dependent. I plan to solve the fully-coupled problem with a
segregated approach: I first solve the Navier-Stokes equation for a fixed
temperature and feed the velocity to the thermal equation, then use that
temperature back in the Navier-Stokes equation to solve for the velocity
again until I reach convergence. If I assemble the residual and jacobian
for the fully coupled system with the proper fields section for the
fieldsplit preconditioner (I am using Firedrake), is there a way to tell
PETSc to solve the problem with a segregated approach?

Thanks,
Miguel


Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-14 Thread Nicolas Barnafi via petsc-users

Hello Matt,

After some discussions elsewhere (thanks @LMitchell!), we found out that 
the problem is that the fields are set up with PCSetIS, without an 
attached DM, which does not support this kind of nested fields.


I would like to add this feature, meaning that during the setup of the 
preconditioner there should be an additional routine when there is no dm 
that reads _%i_fields and sets the corresponding fields to the sub PCs, 
in some order.


My plan would be to do so at the PCSetUp_FieldSplit level. The idea is 
that whenever some IS fields are set such as 'a' and 'b', it should be 
possible to rearrange them with '-pc_fieldsplit_0_fields a,b' , or at 
least support this with numbered fields.


How do you see it?

Best,
NB

On 06/02/23 17:57, Matthew Knepley wrote:
On Mon, Feb 6, 2023 at 11:45 AM Nicolas Barnafi 
 wrote:


Thank you Matt,

Again, at the bottom of this message you will find the -info
output. I don't see any output related to the fields,


If the splits were done automatically, you would see an info message 
from here:


https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L1595

Thus it must be setup here

https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L380

There are info statements, but you do not see them, I do not see a way 
around using a small example
to understand how you are setting up the system, since it is working 
as expected in the PETSc examples.


  Thanks,

      Matt

Best


-- -info

[0]  PetscDetermineInitialFPTrap(): Floating point trapping
is on by default 13
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType host available, initializing
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDevice host initialized, default device id 0, view FALSE,
init type lazy
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType cuda not available
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType hip not available
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType sycl not available
[0]  PetscInitialize_Common(): PETSc successfully started:
number of processors = 1
[0]  PetscGetHostName(): Rejecting domainname, likely is NIS
nico-santech.(none)
[0]  PetscInitialize_Common(): Running on machine: nico-santech
[0]  SlepcInitialize(): SLEPc successfully started
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 

[petsc-users] MatFDColoringSetUp with Periodic BC

2023-02-14 Thread Pierre Bernigaud

Dear all,

I hope this email finds you well.
We are currently working on a solver that employs DMDA with SNES.
The Jacobian is computed via FDColoring, i.e.:


call DMDACreate1D(PETSC_COMM_WORLD, DM_BOUNDARY_GHOSTED, NY, NC, NGc, 
PETSC_NULL_INTEGER, dmF, ierr)


! - Some steps ...

call DMCreateColoring(dmF, IS_COLORING_GLOBAL, iscoloring, ierr)
call MatFDColoringCreate(Jac,iscoloring, matfdcoloring, ierr)
call MatFDColoringSetFunction(matfdcoloring, FormFunction, CTX, ierr)
call MatFDColoringSetUp(Jac ,iscoloring,matfdcoloring, ierr)
call SNESSetJacobian(snes, Jac, Jac, SNESComputeJacobianDefaultColor, 
matfdcoloring, ierr)


Everything is running smoothly.
Recently, we modified the boundary conditions such as to use periodic 
BC:


call DMDACreate1D(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, NY, NC, NGc, 
PETSC_NULL_INTEGER, dmF, ierr)


We then encountered frequent crashes when calling MatFDColoringSetUp, 
depending on the number of cells NY. After looking for a solution, I 
found this old thread: 
https://lists.mcs.anl.gov/pipermail/petsc-users/2013-May/017449.html
It appears that when using periodic BCs, FDColoring can only be used if 
the number of cells is divisible by 2*NGc+1. Even though this is only a 
slight annoyance, I was wondering whether you were working on this matter 
or had a quick fix at hand. At any rate, I think it would be nice if 
a warning were displayed in the FDColoring documentation.


Respectfully,
Pierre Bernigaud
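
For reference, a guard for the constraint mentioned above could look like this in C
(the original code is Fortran; NY, NC, NGc are the same quantities as in the calls
above, other names are illustrative):

#include <petscdmda.h>

static PetscErrorCode CreatePeriodicDA(MPI_Comm comm, PetscInt NY, PetscInt NC, PetscInt NGc, DM *dmF)
{
  PetscFunctionBeginUser;
  /* With DM_BOUNDARY_PERIODIC, FD coloring needs NY divisible by 2*NGc+1 */
  PetscCheck(NY % (2 * NGc + 1) == 0, comm, PETSC_ERR_ARG_INCOMP,
             "Periodic DMDA coloring needs NY divisible by 2*NGc+1 (NY=%" PetscInt_FMT ", NGc=%" PetscInt_FMT ")",
             NY, NGc);
  PetscCall(DMDACreate1d(comm, DM_BOUNDARY_PERIODIC, NY, NC, NGc, NULL, dmF));
  PetscCall(DMSetUp(*dmF));
  PetscFunctionReturn(0);
}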