Hello,

Thanks for your responses.

@David
Reducing the number of Aq matrices is not really an option for my
application, so I would like to see whether I can reduce the memory
requirement in some other way.

@Paul
Indeed, PetscMatrix is used on the Reduced Basis side as well. I went
through the PetscMatrix class documentation and tried setting the
MAT_IGNORE_ZERO_ENTRIES flag via MatSetOption. It turns out that the
flag does not help once the matrix has been pre-allocated: the flag only
affects value insertion, while the memory is reserved by the
pre-allocation itself.
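
For reference, this is roughly what I tried (a sketch only; rb_con is my
RBConstruction object, and get_Aq() and PetscMatrix::mat() are the
accessors I am assuming for the q-th operator and the raw PETSc handle):

    // Grab the raw PETSc Mat underlying the q-th affine operator
    PetscMatrix<Number> & pmat =
      cast_ref<PetscMatrix<Number> &>(*rb_con.get_Aq(q));

    // Ask PETSc to drop explicitly-zero values at insertion time
    MatSetOption(pmat.mat(), MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);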

Since pre-allocation is necessary for good assembly performance, I do
not want to disable it. Is it possible instead to construct a sparsity
pattern restricted to a particular subdomain of the mesh and then attach
it to a sparse matrix? If so, I think that would solve my problem.
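
To make the idea concrete, the kind of per-row pre-allocation I have in
mind looks like this in raw PETSc (a sketch only; n_local_rows,
row_in_block() and full_row_nnz are hypothetical stand-ins for the local
row count, "this row's DoF lives in the block where Aq is non-zero", and
the usual per-row counts from the sparsity pattern):

    // Rows outside the block get a single (diagonal) slot; rows inside
    // keep their full counts from the sparsity pattern.
    std::vector<PetscInt> nnz(n_local_rows);
    for (PetscInt i = 0; i < n_local_rows; ++i)
      nnz[i] = row_in_block(i) ? full_row_nnz[i] : 1;

    // Serial case; MatMPIAIJSetPreallocation() is the MPI analogue
    MatSeqAIJSetPreallocation(pmat.mat(), 0, nnz.data());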

Best regards,
Nikhil

On Thu, Apr 23, 2020 at 3:22 PM Paul T. Bauman <ptbau...@gmail.com> wrote:

> Hi Nikhil,
>
> Typically, you would grab the "raw" underlying Mat object and do the PETSc
> call yourself. This would be accessible from the libMesh::PetscMatrix
> object. Typically the ImplicitSystem has the system matrix, but I'm not
> familiar with where these matrices would be cached on the RB side of
> things. I would use the Doxygen documentation as a starting point:
> libmesh.github.io (although it seems GitHub is misbehaving ATM). I hope
> that helps.
>
> Best,
>
> Paul
>
> On Thu, Apr 23, 2020 at 12:06 AM Nikhil Vaidya <nikhilvaidy...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I am using the reduced basis capability for my work. The number of Aq
>> matrices in my problem is very large (~250), and because of this the
>> memory requirement of my program is ~200 times the mesh-file size. I
>> did a memory-usage study and found that the allocate_data_structures()
>> function in RBConstruction is responsible for most of the memory usage.
>>
>> I think this might be due to the large number of PETSc sparse matrices
>> being constructed. Each Aq matrix is known to have non-zero entries only
>> in a small sub-domain of the geometry, and all these sub-domains are
>> separate blocks in the mesh. I suspect that for each sparse matrix,
>> memory is being allocated even for the zero entries. I saw that PETSc
>> has an option MAT_IGNORE_ZERO_ENTRIES. How does one pass this in
>> libMesh? Please note that I am not using libMesh directly: my code is a
>> MOOSE app, and I am using the RB functionality of the underlying
>> libMesh code.
>>
>> Best regards,
>> Nikhil

_______________________________________________
Libmesh-users mailing list
Libmesh-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/libmesh-users