Hana,
"product AB with A nest, B nest" is not supported by PETSc. I do not know why
PETSc does not display such an error message. I'll check it.
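For illustration, a minimal sketch of the combination in question, assuming
hypothetical 2x2 nests of small dense blocks; the MatConvert workaround at the
end is only a suggestion, not something I have tested:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      blocks[4], A, B, C, Aaij, Baij;
  PetscInt k;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Four small dense blocks, just to have something to nest */
  for (k = 0; k < 4; k++) {
    PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 4, 4, NULL, &blocks[k]));
    PetscCall(MatAssemblyBegin(blocks[k], MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(blocks[k], MAT_FINAL_ASSEMBLY));
  }
  PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &A));
  PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &B));

  /* Not supported: C = A*B with both operands of type MATNEST */
  /* PetscCall(MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C)); */

  /* Possible workaround (untested suggestion): convert the nests to AIJ first */
  PetscCall(MatConvert(A, MATAIJ, MAT_INITIAL_MATRIX, &Aaij));
  PetscCall(MatConvert(B, MATAIJ, MAT_INITIAL_MATRIX, &Baij));
  PetscCall(MatMatMult(Aaij, Baij, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C));

  PetscCall(MatDestroy(&C));
  PetscCall(MatDestroy(&Aaij));
  PetscCall(MatDestroy(&Baij));
  PetscCall(MatDestroy(&A));
  PetscCall(MatDestroy(&B));
  for (k = 0; k < 4; k++) PetscCall(MatDestroy(&blocks[k]));
  PetscCall(PetscFinalize());
  return 0;
}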
Hong
From: petsc-users on behalf of Hana Honnerová
Sent: Thursday, February 8, 2024 4:45 AM
To:
On Thu, Feb 8, 2024 at 9:54 AM Berend van Wachem wrote:
Dear Matt,
I have now written a code to transform a DMPlex coming from a DMForest
into a DMPlex of type DM_POLY, by removing the duplicate faces and edges.
I can also successfully write this transformed DMPlex of type DM_POLY
to a file.
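For reference, a minimal sketch of the starting point assumed here, using a
p8est-based forest; the topology string and the VTK file name are hypothetical,
and the duplicate-face/edge removal itself is not shown:

#include <petscdmforest.h>
#include <petscdmplex.h>
#include <petscviewer.h>

int main(int argc, char **argv)
{
  DM          forest, plex;
  PetscViewer viewer;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &forest));
  PetscCall(DMSetType(forest, DMP8EST)); /* 3D forest; DMP4EST in 2D */
  PetscCall(DMForestSetTopology(forest, "unit")); /* hypothetical; see -dm_forest_topology */
  PetscCall(DMSetUp(forest));

  /* The converted DMPlex still contains the duplicate faces/edges */
  PetscCall(DMConvert(forest, DMPLEX, &plex));

  /* Write the (untransformed) mesh to a VTK file */
  PetscCall(PetscViewerVTKOpen(PETSC_COMM_WORLD, "mesh.vtu", FILE_MODE_WRITE, &viewer));
  PetscCall(DMView(plex, viewer));
  PetscCall(PetscViewerDestroy(&viewer));

  PetscCall(DMDestroy(&plex));
  PetscCall(DMDestroy(&forest));
  PetscCall(PetscFinalize());
  return 0;
}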
However, the code to perform the transformation only works
No, it uses the exact layout you provided.
You can use
https://petsc.org/release/manualpages/PC/PCREDISTRIBUTE/#pcredistribute to have
the solver redistribute the rows so that each MPI process has an equal number
during the solve, which will give you the effect you are looking for.
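A minimal sketch of how this can be enabled, assuming A, b, and x already
exist (wrapped in a helper here just for illustration):

#include <petscksp.h>

static PetscErrorCode SolveRedistributed(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), &ksp));
  PetscCall(KSPSetOperators(ksp, A, A)); /* e.g. the uneven 100/50 row layout */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCREDISTRIBUTE)); /* rows rebalanced across ranks for the solve */
  /* The inner solver is controlled with the -redistribute_ options prefix,
     e.g. -redistribute_ksp_type gmres -redistribute_pc_type bjacobi */
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x)); /* x comes back in the original layout */
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}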
On Thu, Feb 8, 2024 at 3:15 AM Maruthi NH wrote:
> Hi Barry,
> Thanks. Yes, the global column index was wrong. I have one more question
> regarding MatCreateMPIAIJWithArrays. If I have 100 elements in rank 0 and
> 50 in rank 1, does PETSc redistribute equally among procs before solving?
>
Hi all,
I am trying to solve linear systems arising from isogeometric
discretization (similar to FEM) of the Navier-Stokes equations in
parallel using PETSc. The linear systems are of saddle-point type, so I
would like to use the PCFIELDSPLIT preconditioner with the
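For what it's worth, a minimal sketch of a typical Schur-complement
PCFIELDSPLIT setup for a saddle-point system; the index sets isU and isP
(velocity and pressure unknowns) are assumed to come from the discretization:

#include <petscksp.h>

static PetscErrorCode SetupSaddlePointSolver(Mat A, IS isU, IS isP, KSP *ksp)
{
  PC pc;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), ksp));
  PetscCall(KSPSetOperators(*ksp, A, A));
  PetscCall(KSPGetPC(*ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  PetscCall(PCFieldSplitSetIS(pc, "u", isU)); /* velocity unknowns */
  PetscCall(PCFieldSplitSetIS(pc, "p", isP)); /* pressure unknowns */
  PetscCall(PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR));
  PetscCall(PCFieldSplitSetSchurFactType(pc, PC_FIELDSPLIT_SCHUR_FACT_FULL));
  /* Sub-solvers are tunable via the -fieldsplit_u_ and -fieldsplit_p_ option prefixes */
  PetscCall(KSPSetFromOptions(*ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}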
Hi Barry,
Thanks. Yes, the global column index was wrong. I have one more question
regarding MatCreateMPIAIJWithArrays. If I have 100 elements in rank 0 and
50 in rank 1, does PETSc redistribute equally among procs before solving?
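For completeness, a minimal sketch of this layout (a diagonal matrix, one
nonzero per row, run on exactly two ranks); note that the column indices in
j[] must be global:

#include <petscmat.h>

int main(int argc, char **argv)
{
  PetscMPIInt  rank;
  PetscInt     m, rstart, k, *ia, *ja;
  PetscScalar *va;
  Mat          A;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  m      = (rank == 0) ? 100 : 50; /* local rows: 100 on rank 0, 50 on rank 1 */
  rstart = (rank == 0) ? 0 : 100;  /* first global row owned by this rank */
  PetscCall(PetscMalloc3(m + 1, &ia, m, &ja, m, &va));
  for (k = 0; k < m; k++) {
    ia[k] = k;          /* CSR row offsets over the local rows */
    ja[k] = rstart + k; /* column indices are GLOBAL, not local */
    va[k] = 1.0;
  }
  ia[m] = m;
  PetscCall(MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD, m, PETSC_DECIDE, 150, 150, ia, ja, va, &A));
  PetscCall(PetscFree3(ia, ja, va)); /* the arrays are copied by PETSc */
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}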
Regards,
Maruthi
On Mon, Feb 5, 2024 at 2:18 AM Barry Smith wrote: