Great, thanks for letting us know. It'll merge to release shortly and thus be 
in petsc >= 3.20.3.

"Fackler, Philip" <fackle...@ornl.gov> writes:

> Jed,
>
> That seems to have worked (ridiculously well). It's now 55MB, and it's 
> happening in the call to MatSetPreallocationCOO.
>
> Thank you,
>
> Philip Fackler
> Research Software Engineer, Application Engineering Group
> Advanced Computing Systems Research Section
> Computer Science and Mathematics Division
> Oak Ridge National Laboratory
> ________________________________
> From: Jed Brown <j...@jedbrown.org>
> Sent: Thursday, December 14, 2023 16:27
> To: Fackler, Philip <fackle...@ornl.gov>; petsc-users@mcs.anl.gov 
> <petsc-users@mcs.anl.gov>; xolotl-psi-developm...@lists.sourceforge.net 
> <xolotl-psi-developm...@lists.sourceforge.net>
> Subject: [EXTERNAL] Re: [petsc-users] Call to DMSetMatrixPreallocateSkip not 
> changing allocation behavior
>
> I had a one-character typo in the diff above. This MR to release should work 
> now.
>
> https://gitlab.com/petsc/petsc/-/merge_requests/7120
>
> Jed Brown <j...@jedbrown.org> writes:
>
>> 17 GB for a 1D DMDA, wow. :-)
>>
>> Could you try applying this diff to make it work for DMDA (it's currently 
>> handled by DMPlex)?
>>
>> diff --git i/src/dm/impls/da/fdda.c w/src/dm/impls/da/fdda.c
>> index cad4d926504..bd2a3bda635 100644
>> --- i/src/dm/impls/da/fdda.c
>> +++ w/src/dm/impls/da/fdda.c
>> @@ -675,19 +675,21 @@ PetscErrorCode DMCreateMatrix_DA(DM da, Mat *J)
>>     specialized setting routines depend only on the particular preallocation
>>     details of the matrix, not the type itself.
>>    */
>> -  PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPIAIJSetPreallocation_C", &aij));
>> -  if (!aij) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqAIJSetPreallocation_C", &aij));
>> -  if (!aij) {
>> -    PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPIBAIJSetPreallocation_C", &baij));
>> -    if (!baij) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqBAIJSetPreallocation_C", &baij));
>> -    if (!baij) {
>> -      PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPISBAIJSetPreallocation_C", &sbaij));
>> -      if (!sbaij) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqSBAIJSetPreallocation_C", &sbaij));
>> -      if (!sbaij) {
>> -        PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPISELLSetPreallocation_C", &sell));
>> -        if (!sell) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqSELLSetPreallocation_C", &sell));
>> +  if (!dm->prealloc_skip) { // Flag is likely set when user intends to use MatSetPreallocationCOO()
>> +    PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPIAIJSetPreallocation_C", &aij));
>> +    if (!aij) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqAIJSetPreallocation_C", &aij));
>> +    if (!aij) {
>> +      PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPIBAIJSetPreallocation_C", &baij));
>> +      if (!baij) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqBAIJSetPreallocation_C", &baij));
>> +      if (!baij) {
>> +        PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPISBAIJSetPreallocation_C", &sbaij));
>> +        if (!sbaij) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqSBAIJSetPreallocation_C", &sbaij));
>> +        if (!sbaij) {
>> +          PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatMPISELLSetPreallocation_C", &sell));
>> +          if (!sell) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatSeqSELLSetPreallocation_C", &sell));
>> +        }
>> +        if (!sell) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatISSetPreallocation_C", &is));
>>       }
>> -      if (!sell) PetscCall(PetscObjectQueryFunction((PetscObject)A, "MatISSetPreallocation_C", &is));
>>     }
>>   }
>>   if (aij) {
>>
>>
>> "Fackler, Philip via petsc-users" <petsc-users@mcs.anl.gov> writes:
>>
>>> I'm using the following sequence of functions related to the Jacobian 
>>> matrix:
>>>
>>> DMDACreate1d(..., &da);
>>> DMSetFromOptions(da);
>>> DMSetUp(da);
>>> DMSetMatType(da, MATAIJKOKKOS);
>>> DMSetMatrixPreallocateSkip(da, PETSC_TRUE);
>>> Mat J;
>>> DMCreateMatrix(da, &J);
>>> MatSetPreallocationCOO(J, ...);
>>>
>>> I recently added the call to DMSetMatrixPreallocateSkip, hoping the 
>>> allocation would be delayed to MatSetPreallocationCOO, and that it would 
>>> require less memory. The documentation
>>> (https://petsc.org/release/manualpages/DM/DMSetMatrixPreallocateSkip/)
>>> says that the data structures will not be preallocated. The following
>>> data from heaptrack shows that the allocation is still happening in the
>>> call to DMCreateMatrix.
>>>
>>> [attached heaptrack screenshot showing the allocation inside DMCreateMatrix]
>>>
>>> Can someone help me understand this?
>>>
>>> Thanks,
>>>
>>> Philip Fackler
>>> Research Software Engineer, Application Engineering Group
>>> Advanced Computing Systems Research Section
>>> Computer Science and Mathematics Division
>>> Oak Ridge National Laboratory
