Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Nicholas Arnold-Medabalimi
Hi

Thanks so much for all your help. I've gotten all the core functionality working
in C++ and am now working on the Fortran integration. Before I do that, I've been
doing some memory checks with Valgrind to make sure everything is sound, since
I've been seeing random memory-corruption errors for specific mesh sizes. I'm
getting an invalid-write error associated with a DMCreateGlobalVector call. I
presume something small in my section assignment is causing this, but I would
appreciate any insight. This is more or less ripped directly from Plex tutorial
example 7.

PetscErrorCode addSectionToDM(DM &dm, Vec &state)
{
    int p0, p1, c0, c1;
    DMPlexGetHeightStratum(dm, 0, &c0, &c1);

    DMPlexGetChart(dm, &p0, &p1);
    PetscSection section_full;
    PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
    PetscSectionSetNumFields(section_full, 2);
    PetscSectionSetChart(section_full, p0, p1);
    PetscSectionSetFieldName(section_full, 0, "Pressure");
    PetscSectionSetFieldName(section_full, 1, "Temperature");

    for (int i = c0; i < c1; i++)
    {
        PetscSectionSetDof(section_full, i, 2);
        PetscSectionSetFieldDof(section_full, i, 0, 1);
        PetscSectionSetFieldDof(section_full, i, 1, 1);
    }
    PetscSectionSetUp(section_full);
    DMSetNumFields(dm, 2);
    DMSetLocalSection(dm, section_full);
    PetscSectionDestroy(&section_full);
    DMCreateGlobalVector(dm, &state);

    return 0;
}

results in
==12603== Invalid write of size 8
==12603==at 0x10CD8B: main (redistribute_filter.cpp:254)
==12603==  Address 0xb9fe800 is 4,816 bytes inside a block of size 4,820
alloc'd
==12603==at 0x483E0F0: memalign (in
/usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==12603==by 0x483E212: posix_memalign (in
/usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==12603==by 0x4C4DAB0: PetscMallocAlign (mal.c:54)
==12603==by 0x4C5262F: PetscTrMallocDefault (mtr.c:186)
==12603==by 0x4C501F7: PetscMallocA (mal.c:420)
==12603==by 0x527E8A9: VecCreate_MPI_Private (pbvec.c:485)
==12603==by 0x527F04F: VecCreate_MPI (pbvec.c:523)
==12603==by 0x53E7097: VecSetType (vecreg.c:89)
==12603==by 0x527FBC8: VecCreate_Standard (pbvec.c:547)
==12603==by 0x53E7097: VecSetType (vecreg.c:89)
==12603==by 0x6CD77C0: DMCreateGlobalVector_Section_Private (dmi.c:58)
==12603==by 0x61D52DB: DMCreateGlobalVector_Plex (plexcreate.c:4130)
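
For reference, here is the same routine with explicit error checking added (a
sketch, assuming a recent PETSc that provides PetscCall); wrapping every call
this way makes it easier to see which call actually trips first:

#include <petscdmplex.h>

PetscErrorCode addSectionToDM(DM &dm, Vec &state)
{
    PetscInt     p0, p1, c0, c1;
    PetscSection section_full;

    PetscFunctionBeginUser;
    /* cells are the height-0 points; the chart covers all mesh points */
    PetscCall(DMPlexGetHeightStratum(dm, 0, &c0, &c1));
    PetscCall(DMPlexGetChart(dm, &p0, &p1));
    PetscCall(PetscSectionCreate(PETSC_COMM_WORLD, &section_full));
    PetscCall(PetscSectionSetNumFields(section_full, 2));
    PetscCall(PetscSectionSetChart(section_full, p0, p1));
    PetscCall(PetscSectionSetFieldName(section_full, 0, "Pressure"));
    PetscCall(PetscSectionSetFieldName(section_full, 1, "Temperature"));
    /* two dofs per cell: one for each field */
    for (PetscInt i = c0; i < c1; i++)
    {
        PetscCall(PetscSectionSetDof(section_full, i, 2));
        PetscCall(PetscSectionSetFieldDof(section_full, i, 0, 1));
        PetscCall(PetscSectionSetFieldDof(section_full, i, 1, 1));
    }
    PetscCall(PetscSectionSetUp(section_full));
    PetscCall(DMSetNumFields(dm, 2));
    PetscCall(DMSetLocalSection(dm, section_full));
    PetscCall(PetscSectionDestroy(&section_full));
    PetscCall(DMCreateGlobalVector(dm, &state));
    PetscFunctionReturn(0);
}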


Sincerely
Nicholas



On Mon, Dec 26, 2022 at 10:52 AM Matthew Knepley  wrote:

> On Mon, Dec 26, 2022 at 10:49 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Oh it's not a worry. I'm debugging this first in C++, and once it's
>> working I don't actually need to view what's happening in Fortran when I
>> move over. In my C debugging code. After I create the distribution vector
>> and distribute the field based on your input, I'm adding
>>
>> VecSetOperation(state_dist, VECOP_VIEW, (void(*)(void))VecView_Plex);
>>
>> But I can't compile it because VecView_Plex is undefined. Thanks
>>
>
> You need
>
> #include 
>
>   Thanks
>
>  Matt
>
>
>> Sincerely
>> Nicholas
>>
>>
>>
>> On Mon, Dec 26, 2022 at 10:45 AM Matthew Knepley 
>> wrote:
>>
>>> On Mon, Dec 26, 2022 at 10:40 AM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 Hi Matt

 1) I'm not sure I follow how to call this. If I insert the
 VecSetOperation call I'm not exactly sure what the VecView_Plex is or where
 it is defined?

>>>
>>> Shoot, this cannot be done in Fortran. I will rewrite this step to get
>>> it working for you. It should have been done anyway. I cannot do it
>>> today since I have to grade finals, but I should have it this week.
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>>
 2) Otherwise I've solved this problem with the insight you provided
 into the local section. Things look good on the ASCII output but if we can
 resolve 1 then I think the loop is fully closed and I can just worry about
 the fortran translation.

 Thanks again for all your help.

 Sincerely
 Nicholas



 On Mon, Dec 26, 2022 at 9:37 AM Matthew Knepley 
 wrote:

> On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Hi Matt
>>
>> I was able to get this all squared away. It turns out I was
>> initializing the viewer incorrectly—my mistake. However, there is a
>> follow-up question. A while back, we discussed distributing a vector 
>> field
>> from an initial DM to a new distributed DM. The way you said to do this 
>> was
>>
>> // Distribute the submesh with overlap of 1
>> DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
>> //Create a vector and section for the distribution
>> VecCreate(PETSC_COMM_WORLD, &state_dist);
>> VecSetDM(state_dist, sub_da_dist);
>> PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
>> DMSetLocalSection(sub_da_dist, distSection);
>>  

Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Matthew Knepley
On Mon, Dec 26, 2022 at 10:49 AM Nicholas Arnold-Medabalimi <
narno...@umich.edu> wrote:

> Oh it's not a worry. I'm debugging this first in C++, and once it's
> working I don't actually need to view what's happening in Fortran when I
> move over. In my C debugging code. After I create the distribution vector
> and distribute the field based on your input, I'm adding
>
> VecSetOperation(state_dist, VECOP_VIEW, (void(*)(void))VecView_Plex);
>
> But I can't compile it because VecView_Plex is undefined. Thanks
>

You need

#include 
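
For example, something like this (a sketch; the exact private header that
declares VecView_Plex can differ between PETSc versions, so check the headers
of your installation):

#include <petsc/private/dmpleximpl.h>  /* assumed header for the VecView_Plex declaration */

/* after creating state_dist and attaching it to the distributed DM */
VecSetDM(state_dist, sub_da_dist);
VecSetOperation(state_dist, VECOP_VIEW, (void (*)(void)) VecView_Plex);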

  Thanks

 Matt


> Sincerely
> Nicholas
>
>
>
> On Mon, Dec 26, 2022 at 10:45 AM Matthew Knepley 
> wrote:
>
>> On Mon, Dec 26, 2022 at 10:40 AM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Hi Matt
>>>
>>> 1) I'm not sure I follow how to call this. If I insert the
>>> VecSetOperation call I'm not exactly sure what the VecView_Plex is or where
>>> it is defined?
>>>
>>
>> Shoot, this cannot be done in Fortran. I will rewrite this step to get it
>> working for you. It should have been done anyway. I cannot do it
>> today since I have to grade finals, but I should have it this week.
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> 2) Otherwise I've solved this problem with the insight you provided into
>>> the local section. Things look good on the ASCII output but if we can
>>> resolve 1 then I think the loop is fully closed and I can just worry about
>>> the fortran translation.
>>>
>>> Thanks again for all your help.
>>>
>>> Sincerely
>>> Nicholas
>>>
>>>
>>>
>>> On Mon, Dec 26, 2022 at 9:37 AM Matthew Knepley 
>>> wrote:
>>>
 On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
 narno...@umich.edu> wrote:

> Hi Matt
>
> I was able to get this all squared away. It turns out I was
> initializing the viewer incorrectly—my mistake. However, there is a
> follow-up question. A while back, we discussed distributing a vector field
> from an initial DM to a new distributed DM. The way you said to do this 
> was
>
> // Distribute the submesh with overlap of 1
> DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
> //Create a vector and section for the distribution
> VecCreate(PETSC_COMM_WORLD, &state_dist);
> VecSetDM(state_dist, sub_da_dist);
> PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
> DMSetLocalSection(sub_da_dist, distSection);
> DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
> state_filtered, distSection, state_dist);
>   I've forgone Fortran to debug this all in C and then integrate the
> function calls into the Fortran code.
>
> There are two questions here.
>
> 1) How do I associate a vector associated with a DM using VecSetDM to
> output properly as a VTK? When I call VecView at present, if I call 
> VecView
> on state_dist, it will not output anything.
>

 This is a problem. The different pieces of interface were added at
 different times. We should really move that manipulation of the function
 table into VecSetDM(). Here is the code:


 https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/plexcreate.c#L4135

 You can make the overload call yourself for now, until we decide on the
 best fix.


> 2) The visualization is nice, but when I look at the Vec of the
> distributed field using stdout, something isn't distributing correctly, as
> the vector still has some uninitialized values. This is apparent if I
> output the original vector and the distributed vector. Examining the 
> inside
> of DMPlexDistributeField I suspect I'm making a mistake with the sections
> I'm passing. filtered section in this case is the global section but if I
> try the local section I get an error so I'm not sure.
>

 These should definitely be local sections. Global sections are always
 built after the fact, and building the global section needs the SF that
 indicates what points are shared, not the distribution SF that moves
 points. I need to go back and put in checks that all the arguments are the
 right type. Thanks for bringing that up. Lets track down the error for
 local sections.

   Matt


> *Original Vector(state_filtered)*
> Vec Object: Vec_0x8404_1 2 MPI processes
>   type: mpi
> Process [0]
> 101325.
> 300.
> 101326.
> 301.
> 101341.
> 316.
> Process [1]
> 101325.
> 300.
> 101326.
> 301.
> 101345.
> 320.
> 101497.
> 472.
> 101516.
> 491.
> *Re-Distributed Vector (state_dist) *
> Vec Object: 2 MPI processes
>   type: mpi
> Process [0]
> 101325.
> 300.
> 101326.
> 301.
> 101341.
> 316.
> 7.90505e-323
> 1.97626e-323
> 4.30765e-312
> 6.91179e-310
> Process [1]
> 101497.
> 472.
> 101516.
> 491.
> 1.99665e-314

Re: [petsc-users] gamg out of memory with gpu

2022-12-26 Thread Matthew Knepley
On Mon, Dec 26, 2022 at 10:29 AM Edoardo Centofanti <
edoardo.centofant...@universitadipavia.it> wrote:

> Thank you for your answer. Can you provide me the full path of the example
> you have in mind? The one I found does not seem to exploit the algebraic
> multigrid, but just the geometric one.
>

cd $PETSC_DIR/src/snes/tutorials
./ex5 -da_grid_x 64 -da_grid_y 64 -mms 3 -pc_type gamg

and for GPUs I think you need the options to move things over

  -dm_vec_type cuda -dm_mat_type aijcusparse
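
Putting those together, a GPU run of that example would look something like
this (a sketch):

  cd $PETSC_DIR/src/snes/tutorials
  ./ex5 -da_grid_x 64 -da_grid_y 64 -mms 3 -pc_type gamg -dm_vec_type cuda -dm_mat_type aijcusparse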

  Thanks,

 Matt


> Thanks,
> Edoardo
>
> Il giorno lun 26 dic 2022 alle ore 15:39 Matthew Knepley <
> knep...@gmail.com> ha scritto:
>
>> On Mon, Dec 26, 2022 at 4:41 AM Edoardo Centofanti <
>> edoardo.centofant...@universitadipavia.it> wrote:
>>
>>> Hi PETSc Users,
>>>
>>> I am experimenting some issues with the GAMG precondtioner when used
>>> with GPU.
>>> In particular, it seems to go out of memory very easily (around 5000
>>> dofs are enough to make it throw the "[0]PETSC ERROR: cuda error 2
>>> (cudaErrorMemoryAllocation) : out of memory" error).
>>> I have these issues both with single and multiple GPUs (on the same or
>>> on different nodes). The exact same problems work like a charm with HYPRE
>>> BoomerAMG on GPUs.
>>> With both preconditioners I exploit the device acceleration by giving
>>> the usual command line options "-dm_vec_type cuda" and "-dm_mat_type
>>> aijcusparse" (I am working with structured meshes). My PETSc version is
>>> 3.17.
>>>
>>> Is this a known issue of the GAMG preconditioner?
>>>
>>
>> No. Can you get it to do this with a PETSc example? Say SNES ex5?
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> Thank you in advance,
>>> Edoardo
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>> 
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Nicholas Arnold-Medabalimi
Oh, it's not a worry. I'm debugging this first in C++, and once it's working
I don't actually need to view what's happening in Fortran when I move over.
In my C debugging code, after I create the distribution vector and
distribute the field based on your input, I'm adding

VecSetOperation(state_dist, VECOP_VIEW, (void(*)(void))VecView_Plex);

But I can't compile it because VecView_Plex is undefined. Thanks

Sincerely
Nicholas



On Mon, Dec 26, 2022 at 10:45 AM Matthew Knepley  wrote:

> On Mon, Dec 26, 2022 at 10:40 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Hi Matt
>>
>> 1) I'm not sure I follow how to call this. If I insert the
>> VecSetOperation call I'm not exactly sure what the VecView_Plex is or where
>> it is defined?
>>
>
> Shoot, this cannot be done in Fortran. I will rewrite this step to get it
> working for you. It should have been done anyway. I cannot do it
> today since I have to grade finals, but I should have it this week.
>
>   Thanks,
>
>  Matt
>
>
>> 2) Otherwise I've solved this problem with the insight you provided into
>> the local section. Things look good on the ASCII output but if we can
>> resolve 1 then I think the loop is fully closed and I can just worry about
>> the fortran translation.
>>
>> Thanks again for all your help.
>>
>> Sincerely
>> Nicholas
>>
>>
>>
>> On Mon, Dec 26, 2022 at 9:37 AM Matthew Knepley 
>> wrote:
>>
>>> On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 Hi Matt

 I was able to get this all squared away. It turns out I was
 initializing the viewer incorrectly—my mistake. However, there is a
 follow-up question. A while back, we discussed distributing a vector field
 from an initial DM to a new distributed DM. The way you said to do this was

 // Distribute the submesh with overlap of 1
 DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
 //Create a vector and section for the distribution
 VecCreate(PETSC_COMM_WORLD, &state_dist);
 VecSetDM(state_dist, sub_da_dist);
 PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
 DMSetLocalSection(sub_da_dist, distSection);
 DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
 state_filtered, distSection, state_dist);
   I've forgone Fortran to debug this all in C and then integrate the
 function calls into the Fortran code.

 There are two questions here.

 1) How do I associate a vector associated with a DM using VecSetDM to
 output properly as a VTK? When I call VecView at present, if I call VecView
 on state_dist, it will not output anything.

>>>
>>> This is a problem. The different pieces of interface were added at
>>> different times. We should really move that manipulation of the function
>>> table into VecSetDM(). Here is the code:
>>>
>>>
>>> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/plexcreate.c#L4135
>>>
>>> You can make the overload call yourself for now, until we decide on the
>>> best fix.
>>>
>>>
 2) The visualization is nice, but when I look at the Vec of the
 distributed field using stdout, something isn't distributing correctly, as
 the vector still has some uninitialized values. This is apparent if I
 output the original vector and the distributed vector. Examining the inside
 of DMPlexDistributeField I suspect I'm making a mistake with the sections
 I'm passing. filtered section in this case is the global section but if I
 try the local section I get an error so I'm not sure.

>>>
>>> These should definitely be local sections. Global sections are always
>>> built after the fact, and building the global section needs the SF that
>>> indicates what points are shared, not the distribution SF that moves
>>> points. I need to go back and put in checks that all the arguments are the
>>> right type. Thanks for bringing that up. Lets track down the error for
>>> local sections.
>>>
>>>   Matt
>>>
>>>
 *Original Vector(state_filtered)*
 Vec Object: Vec_0x8404_1 2 MPI processes
   type: mpi
 Process [0]
 101325.
 300.
 101326.
 301.
 101341.
 316.
 Process [1]
 101325.
 300.
 101326.
 301.
 101345.
 320.
 101497.
 472.
 101516.
 491.
 *Re-Distributed Vector (state_dist) *
 Vec Object: 2 MPI processes
   type: mpi
 Process [0]
 101325.
 300.
 101326.
 301.
 101341.
 316.
 7.90505e-323
 1.97626e-323
 4.30765e-312
 6.91179e-310
 Process [1]
 101497.
 472.
 101516.
 491.
 1.99665e-314
 8.14714e-321


 Any insight on distributing this field and correcting the error would
 be appreciated.

 Sincerely and Happy Holiday
 Nicholas



 On Fri, Dec 23, 2022 at 10:57 AM Matthew Knepley 
 wrote:

> On Thu, Dec 22, 2022 at 12:41 AM Nicholas 

Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Matthew Knepley
On Mon, Dec 26, 2022 at 10:40 AM Nicholas Arnold-Medabalimi <
narno...@umich.edu> wrote:

> Hi Matt
>
> 1) I'm not sure I follow how to call this. If I insert the VecSetOperation
> call I'm not exactly sure what the VecView_Plex is or where it is defined?
>

Shoot, this cannot be done in Fortran. I will rewrite this step to get it
working for you. It should have been done anyway. I cannot do it
today since I have to grade finals, but I should have it this week.

  Thanks,

 Matt


> 2) Otherwise I've solved this problem with the insight you provided into
> the local section. Things look good on the ASCII output but if we can
> resolve 1 then I think the loop is fully closed and I can just worry about
> the fortran translation.
>
> Thanks again for all your help.
>
> Sincerely
> Nicholas
>
>
>
> On Mon, Dec 26, 2022 at 9:37 AM Matthew Knepley  wrote:
>
>> On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Hi Matt
>>>
>>> I was able to get this all squared away. It turns out I was initializing
>>> the viewer incorrectly—my mistake. However, there is a follow-up question.
>>> A while back, we discussed distributing a vector field from an initial DM
>>> to a new distributed DM. The way you said to do this was
>>>
>>> // Distribute the submesh with overlap of 1
>>> DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
>>> //Create a vector and section for the distribution
>>> VecCreate(PETSC_COMM_WORLD, &state_dist);
>>> VecSetDM(state_dist, sub_da_dist);
>>> PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
>>> DMSetLocalSection(sub_da_dist, distSection);
>>> DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
>>> state_filtered, distSection, state_dist);
>>>   I've forgone Fortran to debug this all in C and then integrate the
>>> function calls into the Fortran code.
>>>
>>> There are two questions here.
>>>
>>> 1) How do I associate a vector associated with a DM using VecSetDM to
>>> output properly as a VTK? When I call VecView at present, if I call VecView
>>> on state_dist, it will not output anything.
>>>
>>
>> This is a problem. The different pieces of interface were added at
>> different times. We should really move that manipulation of the function
>> table into VecSetDM(). Here is the code:
>>
>>
>> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/plexcreate.c#L4135
>>
>> You can make the overload call yourself for now, until we decide on the
>> best fix.
>>
>>
>>> 2) The visualization is nice, but when I look at the Vec of the
>>> distributed field using stdout, something isn't distributing correctly, as
>>> the vector still has some uninitialized values. This is apparent if I
>>> output the original vector and the distributed vector. Examining the inside
>>> of DMPlexDistributeField I suspect I'm making a mistake with the sections
>>> I'm passing. filtered section in this case is the global section but if I
>>> try the local section I get an error so I'm not sure.
>>>
>>
>> These should definitely be local sections. Global sections are always
>> built after the fact, and building the global section needs the SF that
>> indicates what points are shared, not the distribution SF that moves
>> points. I need to go back and put in checks that all the arguments are the
>> right type. Thanks for bringing that up. Lets track down the error for
>> local sections.
>>
>>   Matt
>>
>>
>>> *Original Vector(state_filtered)*
>>> Vec Object: Vec_0x8404_1 2 MPI processes
>>>   type: mpi
>>> Process [0]
>>> 101325.
>>> 300.
>>> 101326.
>>> 301.
>>> 101341.
>>> 316.
>>> Process [1]
>>> 101325.
>>> 300.
>>> 101326.
>>> 301.
>>> 101345.
>>> 320.
>>> 101497.
>>> 472.
>>> 101516.
>>> 491.
>>> *Re-Distributed Vector (state_dist) *
>>> Vec Object: 2 MPI processes
>>>   type: mpi
>>> Process [0]
>>> 101325.
>>> 300.
>>> 101326.
>>> 301.
>>> 101341.
>>> 316.
>>> 7.90505e-323
>>> 1.97626e-323
>>> 4.30765e-312
>>> 6.91179e-310
>>> Process [1]
>>> 101497.
>>> 472.
>>> 101516.
>>> 491.
>>> 1.99665e-314
>>> 8.14714e-321
>>>
>>>
>>> Any insight on distributing this field and correcting the error would be
>>> appreciated.
>>>
>>> Sincerely and Happy Holiday
>>> Nicholas
>>>
>>>
>>>
>>> On Fri, Dec 23, 2022 at 10:57 AM Matthew Knepley 
>>> wrote:
>>>
 On Thu, Dec 22, 2022 at 12:41 AM Nicholas Arnold-Medabalimi <
 narno...@umich.edu> wrote:

> Hi Petsc Users
>
> I've been having trouble consistently getting a vector generated from
> a DM to output to VTK correctly. I've used ex1.c (which works properly)to
> try and figure it out, but I'm still having some issues. I must be missing
> something small that isn't correctly associating the section with the DM.
>
> DMPlexGetChart(dm, &p0, &p1);
> PetscSection section_full;
> PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
> PetscSectionSetNumFields(section_full, 1);
> PetscSectionSetChart(section_full, p0, p1);
> 

Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Nicholas Arnold-Medabalimi
Hi Matt

1) I'm not sure I follow how to call this. If I insert the VecSetOperation
call, I'm not exactly sure what VecView_Plex is or where it is defined?

2) Otherwise, I've solved this problem with the insight you provided into
the local section. Things look good in the ASCII output, but if we can
resolve 1), then I think the loop is fully closed and I can just worry about
the Fortran translation.

Thanks again for all your help.

Sincerely
Nicholas



On Mon, Dec 26, 2022 at 9:37 AM Matthew Knepley  wrote:

> On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Hi Matt
>>
>> I was able to get this all squared away. It turns out I was initializing
>> the viewer incorrectly—my mistake. However, there is a follow-up question.
>> A while back, we discussed distributing a vector field from an initial DM
>> to a new distributed DM. The way you said to do this was
>>
>> // Distribute the submesh with overlap of 1
>> DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
>> //Create a vector and section for the distribution
>> VecCreate(PETSC_COMM_WORLD, &state_dist);
>> VecSetDM(state_dist, sub_da_dist);
>> PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
>> DMSetLocalSection(sub_da_dist, distSection);
>> DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
>> state_filtered, distSection, state_dist);
>>   I've forgone Fortran to debug this all in C and then integrate the
>> function calls into the Fortran code.
>>
>> There are two questions here.
>>
>> 1) How do I associate a vector associated with a DM using VecSetDM to
>> output properly as a VTK? When I call VecView at present, if I call VecView
>> on state_dist, it will not output anything.
>>
>
> This is a problem. The different pieces of interface were added at
> different times. We should really move that manipulation of the function
> table into VecSetDM(). Here is the code:
>
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/plexcreate.c#L4135
>
> You can make the overload call yourself for now, until we decide on the
> best fix.
>
>
>> 2) The visualization is nice, but when I look at the Vec of the
>> distributed field using stdout, something isn't distributing correctly, as
>> the vector still has some uninitialized values. This is apparent if I
>> output the original vector and the distributed vector. Examining the inside
>> of DMPlexDistributeField I suspect I'm making a mistake with the sections
>> I'm passing. filtered section in this case is the global section but if I
>> try the local section I get an error so I'm not sure.
>>
>
> These should definitely be local sections. Global sections are always
> built after the fact, and building the global section needs the SF that
> indicates what points are shared, not the distribution SF that moves
> points. I need to go back and put in checks that all the arguments are the
> right type. Thanks for bringing that up. Lets track down the error for
> local sections.
>
>   Matt
>
>
>> *Original Vector(state_filtered)*
>> Vec Object: Vec_0x8404_1 2 MPI processes
>>   type: mpi
>> Process [0]
>> 101325.
>> 300.
>> 101326.
>> 301.
>> 101341.
>> 316.
>> Process [1]
>> 101325.
>> 300.
>> 101326.
>> 301.
>> 101345.
>> 320.
>> 101497.
>> 472.
>> 101516.
>> 491.
>> *Re-Distributed Vector (state_dist) *
>> Vec Object: 2 MPI processes
>>   type: mpi
>> Process [0]
>> 101325.
>> 300.
>> 101326.
>> 301.
>> 101341.
>> 316.
>> 7.90505e-323
>> 1.97626e-323
>> 4.30765e-312
>> 6.91179e-310
>> Process [1]
>> 101497.
>> 472.
>> 101516.
>> 491.
>> 1.99665e-314
>> 8.14714e-321
>>
>>
>> Any insight on distributing this field and correcting the error would be
>> appreciated.
>>
>> Sincerely and Happy Holiday
>> Nicholas
>>
>>
>>
>> On Fri, Dec 23, 2022 at 10:57 AM Matthew Knepley 
>> wrote:
>>
>>> On Thu, Dec 22, 2022 at 12:41 AM Nicholas Arnold-Medabalimi <
>>> narno...@umich.edu> wrote:
>>>
 Hi Petsc Users

 I've been having trouble consistently getting a vector generated from a
 DM to output to VTK correctly. I've used ex1.c (which works properly)to try
 and figure it out, but I'm still having some issues. I must be missing
 something small that isn't correctly associating the section with the DM.

 DMPlexGetChart(dm, &p0, &p1);
 PetscSection section_full;
 PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
 PetscSectionSetNumFields(section_full, 1);
 PetscSectionSetChart(section_full, p0, p1);
 PetscSectionSetFieldName(section_full, 0, "state");

 for (int i = c0; i < c1; i++)
 {
 PetscSectionSetDof(section_full, i, 1);
 PetscSectionSetFieldDof(section_full, i, 0, 1);
 }
 PetscSectionSetUp(section_full);
 DMSetNumFields(dm, 1);
 DMSetLocalSection(dm, section_full);
 DMCreateGlobalVector(dm, &state_full);

 int o0, o1;
 VecGetOwnershipRange(state_full, &o0, &o1);
 PetscScalar 

Re: [petsc-users] gamg out of memory with gpu

2022-12-26 Thread Edoardo Centofanti
Thank you for your answer. Can you provide me the full path of the example
you have in mind? The one I found does not seem to exploit the algebraic
multigrid, but just the geometric one.

Thanks,
Edoardo

Il giorno lun 26 dic 2022 alle ore 15:39 Matthew Knepley 
ha scritto:

> On Mon, Dec 26, 2022 at 4:41 AM Edoardo Centofanti <
> edoardo.centofant...@universitadipavia.it> wrote:
>
>> Hi PETSc Users,
>>
>> I am experimenting some issues with the GAMG precondtioner when used with
>> GPU.
>> In particular, it seems to go out of memory very easily (around 5000
>> dofs are enough to make it throw the "[0]PETSC ERROR: cuda error 2
>> (cudaErrorMemoryAllocation) : out of memory" error).
>> I have these issues both with single and multiple GPUs (on the same or on
>> different nodes). The exact same problems work like a charm with HYPRE
>> BoomerAMG on GPUs.
>> With both preconditioners I exploit the device acceleration by giving the
>> usual command line options "-dm_vec_type cuda" and "-dm_mat_type
>> aijcusparse" (I am working with structured meshes). My PETSc version is
>> 3.17.
>>
>> Is this a known issue of the GAMG preconditioner?
>>
>
> No. Can you get it to do this with a PETSc example? Say SNES ex5?
>
>   Thanks,
>
>  Matt
>
>
>> Thank you in advance,
>> Edoardo
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] gamg out of memory with gpu

2022-12-26 Thread Matthew Knepley
On Mon, Dec 26, 2022 at 4:41 AM Edoardo Centofanti <
edoardo.centofant...@universitadipavia.it> wrote:

> Hi PETSc Users,
>
> I am experimenting some issues with the GAMG precondtioner when used with
> GPU.
> In particular, it seems to go out of memory very easily (around 5000
> dofs are enough to make it throw the "[0]PETSC ERROR: cuda error 2
> (cudaErrorMemoryAllocation) : out of memory" error).
> I have these issues both with single and multiple GPUs (on the same or on
> different nodes). The exact same problems work like a charm with HYPRE
> BoomerAMG on GPUs.
> With both preconditioners I exploit the device acceleration by giving the
> usual command line options "-dm_vec_type cuda" and "-dm_mat_type
> aijcusparse" (I am working with structured meshes). My PETSc version is
> 3.17.
>
> Is this a known issue of the GAMG preconditioner?
>

No. Can you get it to do this with a PETSc example? Say SNES ex5?

  Thanks,

 Matt


> Thank you in advance,
> Edoardo
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Matthew Knepley
On Mon, Dec 26, 2022 at 3:21 AM Nicholas Arnold-Medabalimi <
narno...@umich.edu> wrote:

> Hi Matt
>
> I was able to get this all squared away. It turns out I was initializing
> the viewer incorrectly—my mistake. However, there is a follow-up question.
> A while back, we discussed distributing a vector field from an initial DM
> to a new distributed DM. The way you said to do this was
>
> // Distribute the submesh with overlap of 1
> DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
> //Create a vector and section for the distribution
> VecCreate(PETSC_COMM_WORLD, &state_dist);
> VecSetDM(state_dist, sub_da_dist);
> PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
> DMSetLocalSection(sub_da_dist, distSection);
> DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
> state_filtered, distSection, state_dist);
>   I've forgone Fortran to debug this all in C and then integrate the
> function calls into the Fortran code.
>
> There are two questions here.
>
> 1) How do I associate a vector associated with a DM using VecSetDM to
> output properly as a VTK? When I call VecView at present, if I call VecView
> on state_dist, it will not output anything.
>

This is a problem. The different pieces of interface were added at
different times. We should really move that manipulation of the function
table into VecSetDM(). Here is the code:


https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/plexcreate.c#L4135

You can make the overload call yourself for now, until we decide on the
best fix.
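
Concretely, that means making the same call that code makes, right after you
attach the vector to the DM (a sketch, reusing the names from your snippet;
"state_dist.vtu" is just a placeholder file name):

VecSetDM(state_dist, sub_da_dist);
VecSetOperation(state_dist, VECOP_VIEW, (void (*)(void)) VecView_Plex);
/* the usual VTK viewer path should then dispatch through the Plex viewer */
PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
PetscViewerSetType(viewer, PETSCVIEWERVTK);
PetscViewerFileSetMode(viewer, FILE_MODE_WRITE);
PetscViewerFileSetName(viewer, "state_dist.vtu");
VecView(state_dist, viewer);
PetscViewerDestroy(&viewer);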


> 2) The visualization is nice, but when I look at the Vec of the
> distributed field using stdout, something isn't distributing correctly, as
> the vector still has some uninitialized values. This is apparent if I
> output the original vector and the distributed vector. Examining the inside
> of DMPlexDistributeField I suspect I'm making a mistake with the sections
> I'm passing. filtered section in this case is the global section but if I
> try the local section I get an error so I'm not sure.
>

These should definitely be local sections. Global sections are always built
after the fact, and building the global section needs the SF that indicates
what points are shared, not the distribution SF that moves points. I need
to go back and put in checks that all the arguments are the right type.
Thanks for bringing that up. Let's track down the error for local sections.
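
In other words, both section arguments to DMPlexDistributeField() should be
local sections, e.g. something like this (a sketch, reusing the names from
your snippet and assuming state_filtered was laid out by the local section
attached to sub_da):

/* borrowed reference to the local section of the un-distributed mesh */
DMGetLocalSection(sub_da, &filteredSection);
DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
                      state_filtered, distSection, state_dist);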

  Matt


> *Original Vector(state_filtered)*
> Vec Object: Vec_0x8404_1 2 MPI processes
>   type: mpi
> Process [0]
> 101325.
> 300.
> 101326.
> 301.
> 101341.
> 316.
> Process [1]
> 101325.
> 300.
> 101326.
> 301.
> 101345.
> 320.
> 101497.
> 472.
> 101516.
> 491.
> *Re-Distributed Vector (state_dist) *
> Vec Object: 2 MPI processes
>   type: mpi
> Process [0]
> 101325.
> 300.
> 101326.
> 301.
> 101341.
> 316.
> 7.90505e-323
> 1.97626e-323
> 4.30765e-312
> 6.91179e-310
> Process [1]
> 101497.
> 472.
> 101516.
> 491.
> 1.99665e-314
> 8.14714e-321
>
>
> Any insight on distributing this field and correcting the error would be
> appreciated.
>
> Sincerely and Happy Holiday
> Nicholas
>
>
>
> On Fri, Dec 23, 2022 at 10:57 AM Matthew Knepley 
> wrote:
>
>> On Thu, Dec 22, 2022 at 12:41 AM Nicholas Arnold-Medabalimi <
>> narno...@umich.edu> wrote:
>>
>>> Hi Petsc Users
>>>
>>> I've been having trouble consistently getting a vector generated from a
>>> DM to output to VTK correctly. I've used ex1.c (which works properly)to try
>>> and figure it out, but I'm still having some issues. I must be missing
>>> something small that isn't correctly associating the section with the DM.
>>>
>>> DMPlexGetChart(dm, &p0, &p1);
>>> PetscSection section_full;
>>> PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
>>> PetscSectionSetNumFields(section_full, 1);
>>> PetscSectionSetChart(section_full, p0, p1);
>>> PetscSectionSetFieldName(section_full, 0, "state");
>>>
>>> for (int i = c0; i < c1; i++)
>>> {
>>> PetscSectionSetDof(section_full, i, 1);
>>> PetscSectionSetFieldDof(section_full, i, 0, 1);
>>> }
>>> PetscSectionSetUp(section_full);
>>> DMSetNumFields(dm, 1);
>>> DMSetLocalSection(dm, section_full);
>>> DMCreateGlobalVector(dm, &state_full);
>>>
>>> int o0, o1;
>>> VecGetOwnershipRange(state_full, &o0, &o1);
>>> PetscScalar *state_full_array;
>>> VecGetArray(state_full, &state_full_array);
>>>
>>> for (int i = 0; i < (c1 - c0); i++)
>>> {
>>> int offset;
>>> PetscSectionGetOffset(section_full, i, &offset);
>>> state_full_array[offset] = 101325 + i;
>>> }
>>>
>>> VecRestoreArray(state_full, &state_full_array);
>>>
>>>
>>> PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
>>> PetscViewerSetType(viewer, PETSCVIEWERVTK);
>>> PetscViewerFileSetMode(viewer, FILE_MODE_WRITE);
>>> PetscViewerFileSetName(viewer, "mesh.vtu");
>>> VecView(state_full, viewer);
>>>
>>> If I run this mesh.vtu isn't generated at all. If I instead do a DMView
>>> passing the DM it will 

[petsc-users] gamg out of memory with gpu

2022-12-26 Thread Edoardo Centofanti
Hi PETSc Users,

I am experiencing some issues with the GAMG preconditioner when used with
GPUs.
In particular, it seems to go out of memory very easily (around 5000
dofs are enough to make it throw the "[0]PETSC ERROR: cuda error 2
(cudaErrorMemoryAllocation) : out of memory" error).
I have these issues both with single and multiple GPUs (on the same or on
different nodes). The exact same problems work like a charm with HYPRE
BoomerAMG on GPUs.
With both preconditioners I exploit the device acceleration by giving the
usual command line options "-dm_vec_type cuda" and "-dm_mat_type
aijcusparse" (I am working with structured meshes). My PETSc version is
3.17.

Is this a known issue of the GAMG preconditioner?

Thank you in advance,
Edoardo


Re: [petsc-users] Getting a vector from a DM to output VTK

2022-12-26 Thread Nicholas Arnold-Medabalimi
Hi Matt

I was able to get this all squared away. It turns out I was initializing
the viewer incorrectly—my mistake. However, there is a follow-up question.
A while back, we discussed distributing a vector field from an initial DM
to a new distributed DM. The way you said to do this was

// Distribute the submesh with overlap of 1
DMPlexDistribute(sub_da, overlap, &distributionSF, &sub_da_dist);
//Create a vector and section for the distribution
VecCreate(PETSC_COMM_WORLD, &state_dist);
VecSetDM(state_dist, sub_da_dist);
PetscSectionCreate(PETSC_COMM_WORLD, &distSection);
DMSetLocalSection(sub_da_dist, distSection);
DMPlexDistributeField(sub_da_dist, distributionSF, filteredSection,
state_filtered, distSection, state_dist);
  I've forgone Fortran to debug this all in C and then integrate the
function calls into the Fortran code.

There are two questions here.

1) How do I get a vector associated with a DM via VecSetDM to output
properly as VTK? At present, if I call VecView on state_dist, it does not
output anything.

2) The visualization is nice, but when I print the Vec of the distributed
field to stdout, something isn't distributing correctly: the vector still
has some uninitialized values. This is apparent if I output the original
vector and the distributed vector. From examining the inside of
DMPlexDistributeField, I suspect I'm making a mistake with the sections I'm
passing. filteredSection in this case is the global section, but if I try
the local section I get an error, so I'm not sure.


*Original Vector(state_filtered)*
Vec Object: Vec_0x8404_1 2 MPI processes
  type: mpi
Process [0]
101325.
300.
101326.
301.
101341.
316.
Process [1]
101325.
300.
101326.
301.
101345.
320.
101497.
472.
101516.
491.
*Re-Distributed Vector (state_dist) *
Vec Object: 2 MPI processes
  type: mpi
Process [0]
101325.
300.
101326.
301.
101341.
316.
7.90505e-323
1.97626e-323
4.30765e-312
6.91179e-310
Process [1]
101497.
472.
101516.
491.
1.99665e-314
8.14714e-321


Any insight on distributing this field and correcting the error would be
appreciated.

Sincerely and Happy Holiday
Nicholas



On Fri, Dec 23, 2022 at 10:57 AM Matthew Knepley  wrote:

> On Thu, Dec 22, 2022 at 12:41 AM Nicholas Arnold-Medabalimi <
> narno...@umich.edu> wrote:
>
>> Hi Petsc Users
>>
>> I've been having trouble consistently getting a vector generated from a
>> DM to output to VTK correctly. I've used ex1.c (which works properly)to try
>> and figure it out, but I'm still having some issues. I must be missing
>> something small that isn't correctly associating the section with the DM.
>>
>> DMPlexGetChart(dm, &p0, &p1);
>> PetscSection section_full;
>> PetscSectionCreate(PETSC_COMM_WORLD, &section_full);
>> PetscSectionSetNumFields(section_full, 1);
>> PetscSectionSetChart(section_full, p0, p1);
>> PetscSectionSetFieldName(section_full, 0, "state");
>>
>> for (int i = c0; i < c1; i++)
>> {
>> PetscSectionSetDof(section_full, i, 1);
>> PetscSectionSetFieldDof(section_full, i, 0, 1);
>> }
>> PetscSectionSetUp(section_full);
>> DMSetNumFields(dm, 1);
>> DMSetLocalSection(dm, section_full);
>> DMCreateGlobalVector(dm, &state_full);
>>
>> int o0, o1;
>> VecGetOwnershipRange(state_full, &o0, &o1);
>> PetscScalar *state_full_array;
>> VecGetArray(state_full, &state_full_array);
>>
>> for (int i = 0; i < (c1 - c0); i++)
>> {
>> int offset;
>> PetscSectionGetOffset(section_full, i, &offset);
>> state_full_array[offset] = 101325 + i;
>> }
>>
>> VecRestoreArray(state_full, &state_full_array);
>>
>>
>> PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
>> PetscViewerSetType(viewer, PETSCVIEWERVTK);
>> PetscViewerFileSetMode(viewer, FILE_MODE_WRITE);
>> PetscViewerFileSetName(viewer, "mesh.vtu");
>> VecView(state_full, viewer);
>>
>> If I run this mesh.vtu isn't generated at all. If I instead do a DMView
>> passing the DM it will just output the mesh correctly.
>>
>> Any assistance would be greatly appreciated.
>>
>
> DMCreateGlobalVector() dispatches to DMCreateGlobalVector_Plex(), which
> resets the view method to VecView_Plex(), which should dispatch to
> VecView_Plex_Local_VTK(). You can verify this in the debugger, or send us
> code we can run to verify it.
>
>   Thanks,
>
> Matt
>
>
>> Sincerely
>> Nicholas
>>
>>
>> --
>> Nicholas Arnold-Medabalimi
>>
>> Ph.D. Candidate
>> Computational Aeroscience Lab
>> University of Michigan
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


-- 
Nicholas Arnold-Medabalimi

Ph.D. Candidate
Computational Aeroscience Lab
University of Michigan