Thanks very much Matt for the detailed explanations.
I was asking about the Schur complement because I have tried a "manual" version
of this procedure without the field split. In the end, it requires the solution
of three linear systems, each of the form A^{-1} u. If you have the LU of A, then
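For concreteness, a minimal sketch of reusing one LU factorization of A for the
repeated A^{-1} u solves; the names ksp, A, u1..u3, x1..x3 are illustrative,
not from the original mail:

  KSP ksp;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));   /* no Krylov iterations */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));           /* direct solve via LU */
  PetscCall(KSPSetUp(ksp));                 /* A is factored once, here */
  /* each solve below reuses the factors: only forward/back substitution */
  PetscCall(KSPSolve(ksp, u1, x1));
  PetscCall(KSPSolve(ksp, u2, x2));
  PetscCall(KSPSolve(ksp, u3, x3));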
PETSc main in debug mode has some additional checks for these cases. Can you
run with the main branch and configure PETSc using --with-debugging=1?
On Tue, Jan 23, 2024 at 10:35 PM Barry Smith wrote:
This could happen if the values in the vector get changed but the
PetscObjectState does not get updated. Normally this is impossible: any action
that changes a vector's values changes its state (so, for example, calling
VecGetArray()/VecRestoreArray() updates the state).
Are you accessing
Do you have an example to reproduce it?
--Junchao Zhang
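To make the state rule above concrete, a minimal sketch; the vector x and the
value written are illustrative:

  PetscScalar *a;
  PetscCall(VecGetArray(x, &a));      /* grants write access ... */
  a[0] = 42.0;                        /* ... values may change here ... */
  PetscCall(VecRestoreArray(x, &a));  /* ... and the object state is updated */
  /* Writing through 'a' after the restore would change values without
     updating the state; use VecGetArrayRead() when only reading. */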
On Tue, Jan 23, 2024 at 10:49 AM wrote:
Hello,
I have used the GMRES solver in PETSc successfully up to now, but on
installing the most recent release, 3.20.3, the solver fails by exiting
early. Output from the code is:
lt-nbi-solve-laplace: starting PETSc solver [23.0537]
0 KSP Residual norm < 1.e-11
Linear solve converged due
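One way to see why the solve stops at iteration 0 is to query the convergence
reason after the solve; a minimal sketch, assuming a KSP named ksp and vectors
b, x already exist:

  KSPConvergedReason reason;
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPGetConvergedReason(ksp, &reason));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "KSP stopped: %s\n",
                        KSPConvergedReasons[reason]));
  /* or, from the command line:
     -ksp_converged_reason -ksp_monitor_true_residual */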
On Tue, Jan 23, 2024 at 11:06 AM Pantelis Moschopoulos
<pmoschopou...@outlook.com> wrote:
Dear Matt,
Thank you for your explanation. The new methodology is straightforward to
implement.
Still, I have one more question. When I use the option
-pc_fieldsplit_schur_precondition full, PETSc internally computes the exact
Schur complement matrix representation. Based on the example
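For reference, that option corresponds to this API call; a sketch, assuming pc
is the fieldsplit preconditioner:

  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  PetscCall(PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR));
  /* same effect as -pc_fieldsplit_schur_precondition full:
     precondition S with an explicitly assembled Schur complement */
  PetscCall(PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_FULL, NULL));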
On Tue, Jan 23, 2024 at 9:45 AM Pantelis Moschopoulos
<pmoschopou...@outlook.com> wrote:
Dear Matt,
I read about MATLRC. However, its correct usage is not clear to me, so I have
the following questions:
1. Should the U and V input matrices be created as dense using MatCreateDense?
2. I use the command MatCreateLRC just to declare the matrix and then
MatLRCSetMats to pass
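A minimal sketch of one plausible answer to both questions, assuming the goal
is N = A + U*diag(c)*V^T with k low-rank columns; the sizes n and k and all
names are illustrative:

  Mat U, V, N;
  /* 1. tall dense factors, one column per constraint */
  PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                           n, k, NULL, &U));
  PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                           n, k, NULL, &V));
  /* ... fill U and V, then MatAssemblyBegin()/MatAssemblyEnd() on both ... */
  /* 2. MatCreateLRC() creates the matrix and sets its parts in one call;
     passing c = NULL makes C the identity */
  PetscCall(MatCreateLRC(A, U, NULL, V, &N));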
On Tue, Jan 23, 2024 at 8:16 AM Pantelis Moschopoulos
<pmoschopou...@outlook.com> wrote:
Dear Matt,
Thank you for your response. This is an idealized setup where I have only one
row/column. Sometimes we might need two or even three constraints based on the
application. Thus, I will pursue the user-defined IS.
When I supply the IS using the command PCFieldSplitSetIS, I do not
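A minimal sketch of supplying user-defined splits; the counts nflow and ncon
and the index array con_rows are placeholders:

  IS is_flow, is_con;
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, nflow, 0, 1, &is_flow));
  PetscCall(ISCreateGeneral(PETSC_COMM_WORLD, ncon, con_rows,
                            PETSC_COPY_VALUES, &is_con));
  /* the names chosen here become the -fieldsplit_flow_* and
     -fieldsplit_con_* option prefixes */
  PetscCall(PCFieldSplitSetIS(pc, "flow", is_flow));
  PetscCall(PCFieldSplitSetIS(pc, "con",  is_con));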
Dear Matt,
Please find attached a test for writing a DMPlex with hanging nodes,
which is based on a refined DMForest. I've linked the code with the
current main git version of PETSc.
When the DMPlex gets written to disk, the code crashes with
[0]PETSC ERROR: Unknown discretization type for
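For context, a sketch of the convert-and-write path as I understand it; the
file name and viewer choice are placeholders, not taken from the attached test:

  DM          plex;
  PetscViewer viewer;
  PetscCall(DMConvert(forest, DMPLEX, &plex));  /* forest: refined DMForest */
  PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5",
                                FILE_MODE_WRITE, &viewer));
  PetscCall(DMView(plex, viewer));              /* crashes here, as reported */
  PetscCall(PetscViewerDestroy(&viewer));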
On Tue, Jan 23, 2024 at 4:23 AM Pantelis Moschopoulos
<pmoschopou...@outlook.com> wrote:
Dear Matt and Dear Barry,
I have some follow-up questions regarding FieldSplit.
Let's assume that I again solve the 3D Stokes flow, but now I also have a global
constraint that controls the flow rate at the inlet. Now, the matrix has the
same unknowns as before, i.e.