Re: [petsc-users] question on PCLSC with matrix blocks of type MATNEST

2024-02-08 Thread Zhang, Hong via petsc-users
Hana, "product AB with A nest, B nest" is not supported by PETSc. I do not know why PETSc does not display such an error message. I'll check it. Hong From: petsc-users on behalf of Hana Honnerová Sent: Thursday, February 8, 2024 4:45 AM To:

Re: [petsc-users] Unique number in each element of a DMPlex mesh

2024-02-08 Thread Matthew Knepley
On Thu, Feb 8, 2024 at 9:54 AM Berend van Wachem wrote: > Dear Matt, > > I have now written code to transform a DMPlex coming from a DMForest > into a DMPlex of type DM_POLY by removing the duplicate faces and edges. > Also, I can successfully write this transformed DMPlex of type DM_POLY > to a
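For the "unique number in each element" part of the subject, one route is DMPlexGetCellNumbering(), which returns a globally unique number per cell. A sketch (untested), assuming a plain DMPlex is already in hand:

    #include <petscdmplex.h>

    DM dm;                       /* the DMPlex, built elsewhere */
    IS cellNumbering;
    const PetscInt *nums;
    PetscInt cStart, cEnd;

    PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
    PetscCall(DMPlexGetCellNumbering(dm, &cellNumbering));
    PetscCall(ISGetIndices(cellNumbering, &nums));
    /* nums[c - cStart] is the global number of local cell c;
       cells not owned by this rank appear with a negative encoding */
    PetscCall(ISRestoreIndices(cellNumbering, &nums));

The IS is borrowed from the DM, so it is not destroyed here.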

Re: [petsc-users] Unique number in each element of a DMPlex mesh

2024-02-08 Thread Berend van Wachem
Dear Matt, I have now written code to transform a DMPlex coming from a DMForest into a DMPlex of type DM_POLY by removing the duplicate faces and edges. I can also successfully write this transformed DMPlex of type DM_POLY to a file. However, the code to perform the transformation only works
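The standard first step of such a transformation is DMConvert() from the forest to a plex; the merging of duplicate faces and edges that the thread discusses is the poster's own code and is not part of this sketch (untested):

    #include <petscdmforest.h>

    DM forest, plex;    /* forest created and set up elsewhere */

    /* DMConvert produces a DMPlex view of the forest */
    PetscCall(DMConvert(forest, DMPLEX, &plex));
    PetscCall(DMViewFromOptions(plex, NULL, "-converted_dm_view"));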

Re: [petsc-users] PETSc crashes when different rank sets row, col and A values using MatCreateSeqAIJWithArrays

2024-02-08 Thread Barry Smith
No, it uses the exact layout you provided. You can use https://petsc.org/release/manualpages/PC/PCREDISTRIBUTE/#pcredistribute to have the solver redistribute the rows so that each MPI process holds an equal number during the solve, which will give you the effect you are looking for.
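A minimal sketch (untested) of selecting PCREDISTRIBUTE in code; the same choice is available as -pc_type redistribute on the command line:

    #include <petscksp.h>

    KSP ksp;  PC pc;    /* ksp created and given its Mat elsewhere */

    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCREDISTRIBUTE));
    /* the manual page suggests -ksp_type preonly, with the inner
       solver configured via -redistribute_ksp_* / -redistribute_pc_* */
    PetscCall(KSPSetFromOptions(ksp));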

Re: [petsc-users] PETSc crashes when different rank sets row, col and A values using MatCreateSeqAIJWithArrays

2024-02-08 Thread Junchao Zhang
On Thu, Feb 8, 2024 at 3:15 AM Maruthi NH wrote: > Hi Barry, > Thanks. Yes, the global column index was wrong. I have one more question > regarding MatCreateMPIAIJWithArrays. If I have 100 elements on rank 0 and > 50 on rank 1, does PETSc redistribute them equally among processes before solving? > No,
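A sketch (untested) of the uneven layout in question; PETSc keeps exactly these local sizes, as noted above. The CSR array names are hypothetical:

    #include <petscmat.h>

    Mat                A;
    PetscMPIInt        rank;
    PetscInt           m;
    const PetscInt    *rowptr, *colidx;  /* local CSR, GLOBAL column indices */
    const PetscScalar *vals;             /* filled elsewhere */

    PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
    m = (rank == 0) ? 100 : 50;          /* intentionally uneven */
    PetscCall(MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD, m, PETSC_DECIDE,
                                        PETSC_DETERMINE, PETSC_DETERMINE,
                                        rowptr, colidx, vals, &A));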

[petsc-users] question on PCLSC with matrix blocks of type MATNEST

2024-02-08 Thread Hana Honnerová
Hi all, I am trying to solve linear systems arising from an isogeometric discretization (similar to FEM) of the Navier-Stokes equations in parallel using PETSc. The linear systems are of saddle-point type, so I would like to use the PCFIELDSPLIT preconditioner with the
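For context, the usual option set for a Schur-complement fieldsplit with an LSC preconditioner on the pressure block looks roughly like the following (a sketch, not taken from this thread):

    -pc_type fieldsplit
    -pc_fieldsplit_type schur
    -pc_fieldsplit_schur_fact_type lower
    -pc_fieldsplit_schur_precondition self
    -fieldsplit_1_pc_type lsc

Note that PCLSC must form matrix products internally, which is where the MATNEST limitation reported above comes in.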

Re: [petsc-users] PETSc crashes when different rank sets row, col and A values using MatCreateSeqAIJWithArrays

2024-02-08 Thread Maruthi NH
Hi Barry, Thanks. Yes, the global column index was wrong. I have one more question regarding MatCreateMPIAIJWithArrays. If I have 100 elements on rank 0 and 50 on rank 1, does PETSc redistribute them equally among processes before solving? Regards, Maruthi
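One way to see that no redistribution happens is to print the ownership range on each rank after the matrix is created; a sketch (untested):

    #include <petscmat.h>

    Mat         A;      /* created with MatCreateMPIAIJWithArrays elsewhere */
    PetscMPIInt rank;
    PetscInt    rstart, rend;

    PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
    PetscCall(MatGetOwnershipRange(A, &rstart, &rend));  /* rend is one past the last owned row */
    PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD,
              "[%d] owns rows %" PetscInt_FMT " to %" PetscInt_FMT "\n",
              rank, rstart, rend - 1));
    PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));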