Hi Nikhil,

Typically, you would grab the "raw" underlying Mat object and do the PETSc
call yourself. This would be accessible from the libMesh::PetscMatrix
object. Typically the ImplicitSystem has the system matrix, but I'm not
familiar with where these matrices would be cached on the RB side of
things. I would use the Oxygen documentation as a starting point:
libmesh.github.io (although it seems GitHub is misbehaving ATM). I hope
that helps.
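
For example, a minimal sketch of what I mean (assuming the matrix you care
about is backed by PETSc; the exact accessor on the RB side may differ, so
check the Doxygen docs for RBConstruction):

```cpp
// Sketch only: assumes `system` is an ImplicitSystem (RBConstruction
// derives from one) whose system matrix is a libMesh::PetscMatrix.
#include "libmesh/implicit_system.h"
#include "libmesh/petsc_matrix.h"

void set_ignore_zeros(libMesh::ImplicitSystem & system)
{
  // Downcast the generic SparseMatrix to its PETSc implementation.
  auto * petsc_mat =
    dynamic_cast<libMesh::PetscMatrix<libMesh::Number> *>(system.matrix);

  if (petsc_mat)
    {
      // Grab the "raw" underlying Mat and make the PETSc call yourself.
      Mat raw = petsc_mat->mat();
      MatSetOption(raw, MAT_IGNORE_ZERO_ENTRIES, PETSC_TRUE);
    }
}
```

Note that MAT_IGNORE_ZERO_ENTRIES only affects subsequent MatSetValues
calls, so you'd want to set it before assembly, not after.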

Best,

Paul

On Thu, Apr 23, 2020 at 12:06 AM Nikhil Vaidya <nikhilvaidy...@gmail.com>
wrote:

> Hi,
>
> I am using reduced basis for my work. The number of Aq matrices in my
> problem is very large (~250). Because of this, the memory requirements of
> my program are ~200 times the mesh-file size. I did some memory usage study
> and found out that the allocate_data_structures() function in
> RBConstruction is responsible for most of the memory usage.
>
> I think this might be due to the large number of PETSc sparse matrices
> being constructed. Each Aq matrix is known to have non-zero entries
> corresponding to only a small sub-domain of the geometry. All these
> sub-domains are separate blocks in the mesh. I get the feeling that for
> each sparse matrix memory is being allocated even for the zero entries. I
> saw that PETSc has an option MAT_IGNORE_ZERO_ENTRIES. How does one pass
> this in libMesh? Please note that I am not using libMesh directly. My code
> is a MOOSE app, and I am using the RB functionality of the underlying
> libMesh code.
>
> Best regards,
> Nikhil
>
> _______________________________________________
> Libmesh-users mailing list
> Libmesh-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/libmesh-users
>