Re: [petsc-users] Problem setting Fieldsplit fields

2023-04-04 Thread Nicolas Barnafi via petsc-users

Hi Matt,

One further question on this. I am working on the code now, but have one 
issue.


The ISes I grab from all fields need to be set on the sub-PCs, but the 
sub-PCs will use a local ordering of the dofs. Is there a tool in PETSc 
to make this coherent? I.e., if I set the IS fields 0, 4 and 7 on a sub-PC 
object within the FieldSplit structure, how do I build the new PC such that 
setting


PCFieldSplitSetIS(pc, "0", field[0])
PCFieldSplitSetIS(pc, "1", field[4])
PCFieldSplitSetIS(pc, "2", field[7])

makes sense in the new subPC?
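
A minimal sketch of what I mean, for the serial case and with hypothetical 
names (perhaps something like ISEmbed is the right tool here, but I have not 
checked): since the submatrix extracted with the concatenated IS keeps the 
order of that IS, the renumbered sub-ISes are just contiguous ranges.

#include <petscksp.h>

/* Hedged sketch (serial case, hypothetical names): given the ISes originally
   passed to PCFieldSplitSetIS() for fields 0, 4 and 7, build the concatenated
   parent IS and the sub-ISes renumbered for the extracted submatrix. */
static PetscErrorCode BuildNestedISes(IS field0, IS field4, IS field7, IS *parent, IS sub[3])
{
  IS       group[3] = {field0, field4, field7};
  PetscInt offset = 0, n;

  PetscFunctionBeginUser;
  PetscCall(ISConcatenate(PETSC_COMM_SELF, 3, group, parent));
  for (PetscInt k = 0; k < 3; ++k) {
    PetscCall(ISGetLocalSize(group[k], &n));
    /* field k occupies a contiguous block of the concatenated ordering */
    PetscCall(ISCreateStride(PETSC_COMM_SELF, n, offset, 1, &sub[k]));
    offset += n;
  }
  /* sub[0], sub[1], sub[2] are what the inner fieldsplit would receive,
     e.g. PCFieldSplitSetIS(subpc, "0", sub[0]); */
  PetscFunctionReturn(0);
}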

Thanks for the help.

Best,
NB

On 22-02-23 14:54, Nicolas Barnafi wrote:

Hi Matt,

Sorry for the late answer, it was holiday time.


Just to clarify, if you call SetIS() 3 times, and then give

  -pc_fieldsplit_0_fields 0,2

then we should reduce the number of fields to two by calling 
ISConcatenate() on the first and last ISes?


Exactly

I think this should not be hard. It will work exactly as it does in 
the DM case, except that the ISes will come from
the PC, not the DM. One complication is that you will have to hold the 
new ISes until the end, and then set them.
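
A sketch of that reduction (assuming is0, is1 and is2 stand for the three 
ISes passed to PCFieldSplitSetIS()):

#include <petscksp.h>

/* Hedged sketch: merge the first and last of three user ISes into one split,
   which is what -pc_fieldsplit_0_fields 0,2 would request. */
static PetscErrorCode ReduceSplits(PC pc, IS is0, IS is1, IS is2, IS newsplit[2])
{
  IS pair[2] = {is0, is2};

  PetscFunctionBeginUser;
  PetscCall(ISConcatenate(PetscObjectComm((PetscObject)pc), 2, pair, &newsplit[0]));
  /* the remaining field keeps its own IS; take a reference since both the
     original and the reduced list now point to it */
  PetscCall(PetscObjectReference((PetscObject)is1));
  newsplit[1] = is1;
  /* as noted above, the reduced ISes have to be held until the end of the
     setup and only then set on the PC */
  PetscFunctionReturn(0);
}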


   Thanks,

     Matt


Nice, then it is exactly what I want. I will work on it and create a PR 
once things start to fit together.


Best,
NB




Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-22 Thread Nicolas Barnafi via petsc-users

Hi Matt,

Sorry for the late answer, it was holiday time.


Just to clarify, if you call SetIS() 3 times, and then give

  -pc_fieldsplit_0_fields 0,2

then we should reduce the number of fields to two by calling 
ISConcatenate() on the first and last ISes?


Exactly

I think this should not be hard. It will work exactly as it does in 
the DM case, except that the ISes will come from
the PC, not the DM. One complication is that you will have to hold the 
new ISes until the end, and then set them.


   Thanks,

     Matt


Nice, then it is exactly what I want. I will work on it and create a PR 
once things start to fit together.


Best,
NB

Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-14 Thread Nicolas Barnafi via petsc-users

Hello Matt,

After some discussions elsewhere (thanks @LMitchell!), we found out that 
the problem is that the fields are set up with PCFieldSplitSetIS, without an 
attached DM, and that code path does not support this kind of field nesting.


I would like to add this feature, meaning that during the setup of the 
preconditioner, when there is no DM, there should be an additional routine 
that reads the -pc_fieldsplit_%i_fields options and sets the corresponding 
fields on the sub-PCs, in some order.


My plan would be to do this at the PCSetUp_FieldSplit level. The idea is 
that whenever some IS fields such as 'a' and 'b' are set, it should be 
possible to regroup them with '-pc_fieldsplit_0_fields a,b', or at 
least to support this with numbered fields.
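
For concreteness, the intended user-side calls would look roughly like this 
(a sketch of the proposed behaviour only; isA and isB are hypothetical ISes, 
and this is not something current PETSc supports for IS-defined splits):

PetscCall(PCSetType(pc, PCFIELDSPLIT));
PetscCall(PCFieldSplitSetIS(pc, "a", isA));
PetscCall(PCFieldSplitSetIS(pc, "b", isB));
PetscCall(PCSetFromOptions(pc));
PetscCall(PCSetUp(pc));  /* proposed: regrouping honoured here when no DM is attached */
/* run-time: -pc_fieldsplit_0_fields a,b  (or 0,1 if only numbered fields are supported) */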


How do you see it?

Best,
NB

On 06/02/23 17:57, Matthew Knepley wrote:
On Mon, Feb 6, 2023 at 11:45 AM Nicolas Barnafi 
 wrote:


Thank you Matt,

Again, at the bottom of this message you will find the -info
output. I don't see any output related to the fields.


If the splits were done automatically, you would see an info message 
from here:


https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L1595

Thus it must be set up here

https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/pc/impls/fieldsplit/fieldsplit.c#L380

There are info statements, but you do not see them. I do not see a way 
around using a small example
to understand how you are setting up the system, since it is working 
as expected in the PETSc examples.


  Thanks,

      Matt

Best


-- -info

[0]  PetscDetermineInitialFPTrap(): Floating point trapping
is on by default 13
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType host available, initializing
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDevice host initialized, default device id 0, view FALSE,
init type lazy
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType cuda not available
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType hip not available
[0]  PetscDeviceInitializeTypeFromOptions_Private():
PetscDeviceType sycl not available
[0]  PetscInitialize_Common(): PETSc successfully started:
number of processors = 1
[0]  PetscGetHostName(): Rejecting domainname, likely is NIS
nico-santech.(none)
[0]  PetscInitialize_Common(): Running on machine: nico-santech
[0]  SlepcInitialize(): SLEPc successfully started
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 94770087780768
[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in
an MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to
PETSc communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm
94770066936960 is being unlinked from inner PETSc comm 

Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-06 Thread Nicolas Barnafi via petsc-users

Thank you Matt,

Again, at the bottom of this message you will find the -info output. I 
don't see any output related to the fields.


Best


-- -info

[0]  PetscDetermineInitialFPTrap(): Floating point trapping is on 
by default 13
[0]  PetscDeviceInitializeTypeFromOptions_Private(): 
PetscDeviceType host available, initializing
[0]  PetscDeviceInitializeTypeFromOptions_Private(): PetscDevice 
host initialized, default device id 0, view FALSE, init type lazy
[0]  PetscDeviceInitializeTypeFromOptions_Private(): 
PetscDeviceType cuda not available
[0]  PetscDeviceInitializeTypeFromOptions_Private(): 
PetscDeviceType hip not available
[0]  PetscDeviceInitializeTypeFromOptions_Private(): 
PetscDeviceType sycl not available
[0]  PetscInitialize_Common(): PETSc successfully started: number 
of processors = 1
[0]  PetscGetHostName(): Rejecting domainname, likely is NIS 
nico-santech.(none)

[0]  PetscInitialize_Common(): Running on machine: nico-santech
[0]  SlepcInitialize(): SLEPc successfully started
[0]  PetscCommDuplicate(): Duplicating a communicator 
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc 
communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960 
is being unlinked from inner PETSc comm 94770087780768

[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an 
MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator 
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc 
communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960 
is being unlinked from inner PETSc comm 94770087780768

[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an 
MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator 
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc 
communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960 
is being unlinked from inner PETSc comm 94770087780768

[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an 
MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator 
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc 
communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960 
is being unlinked from inner PETSc comm 94770087780768

[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an 
MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator 
94770066936960 94770087780768 max tags = 2147483647
[0]  Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc 
communicator embedded in a user MPI_Comm 94770087780768
[0]  Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 94770066936960 
is being unlinked from inner PETSc comm 94770087780768

[0]  PetscCommDestroy(): Deleting PETSc MPI_Comm 94770087780768
[0]  Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an 
MPI_Comm 94770087780768
[0]  PetscCommDuplicate(): Duplicating a communicator 
94770066936960 94770087780768 max tags = 2147483647
[0]  PetscCommDuplicate(): Using internal PETSc communicator 
94770066936960 94770087780768
[0]  MatAssemblyEnd_SeqAIJ(): Matrix size: 1219 X 1219; storage 
space: 0 unneeded,26443 used
[0]  MatAssemblyEnd_SeqAIJ(): Number of mallocs during 
MatSetValues() is 0

[0]  MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 150
[0]  MatCheckCompressedRow(): Found the ratio (num_zerorows 
0)/(num_localrows 1219) < 0.6. Do not use CompressedRow routines.
[0]  MatSeqAIJCheckInode(): Found 1160 nodes out of 1219 rows. Not 
using Inode routines
[0]  PetscCommDuplicate(): Using internal PETSc communicator 
94770066936960 94770087780768
[0]  PetscCommDuplicate(): Using internal PETSc communicator 
94770066936960 94770087780768
[0]  PetscCommDuplicate(): Using internal PETSc communicator 
94770066936960 94770087780768
[0]  PetscCommDuplicate(): Using internal PETSc communicator 
94770066936960 94770087780768
[0]  PetscCommDuplicate(): Using internal PETSc communicator 
94770066936960 94770087780768
[0]  PetscGetHostName(): Rejecting domainname, likely is NIS 
nico-santech.(none)

[0]  PCSetUp(): Setting up PC for first time
[0]  MatAssemblyEnd_SeqAIJ(): Matrix size: 615 X 615; storage 
space: 0 unneeded,9213 used
[0]  MatAssemblyEnd_SeqAIJ(): Number of mallocs during 
MatSetValues() is 0

[0]  

Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-03 Thread Nicolas Barnafi via petsc-users

There are a number of common errors:

   1) Your PC has a prefix

   2) You have not called KSPSetFromOptions() here

Can you send the -ksp_view output?


The PC at least has no prefix. I had to set ksp_rtol to 1 to get through 
the solution process; you will find both the petsc_rc and the ksp_view 
output at the bottom of this message.


Options are indeed being set from the options file, but there must be 
something missing at a certain level. Thanks for looking into this.


Best

 petsc_rc file

-ksp_monitor
-ksp_type gmres
-ksp_view
-mat_type aij
-ksp_norm_type unpreconditioned
-ksp_atol 1e-14
-ksp_rtol 1
-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
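
For reference, a minimal driver sketch of how such an options file would be 
consumed (hypothetical names: A is the assembled matrix, b/x the right-hand 
side and solution, field[0..3] the four ISes defining the splits). The common 
errors mentioned above would be the PC carrying an unexpected prefix, or 
KSPSetFromOptions() never being called:

#include <petscksp.h>

/* Minimal sketch (hypothetical setup): solve A x = b with a fieldsplit PC
   whose four splits are defined by the index sets in field[]. */
static PetscErrorCode SolveWithFieldSplit(Mat A, Vec b, Vec x, IS field[4])
{
  KSP         ksp;
  PC          pc;
  const char *names[4] = {"0", "1", "2", "3"};

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  /* the PC must already be of type fieldsplit for PCFieldSplitSetIS to take effect */
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  for (PetscInt i = 0; i < 4; ++i) PetscCall(PCFieldSplitSetIS(pc, names[i], field[i]));
  /* without this call the options in the petsc_rc file are never applied */
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(0);
}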

 ksp_view

KSP Object: 1 MPI process
  type: gmres
restart=500, using Classical (unmodified) Gram-Schmidt 
Orthogonalization with no iterative refinement

happy breakdown tolerance 1e-30
  maximum iterations=1, nonzero initial guess
  tolerances:  relative=1., absolute=1e-14, divergence=1.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 1 MPI process
  type: fieldsplit
FieldSplit with MULTIPLICATIVE composition: total splits = 4
Solver info for each split is in the following KSP objects:
  Split number 0 Defined by IS
  KSP Object: (fieldsplit_0_) 1 MPI process
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_0_) 1 MPI process
type: ilu
PC has not been set up so information may be incomplete
  out-of-place factorization
  0 levels of fill
  tolerance for zero pivot 2.22045e-14
  matrix ordering: natural
  matrix solver type: petsc
  matrix not yet factored; no additional information available
linear system matrix = precond matrix:
Mat Object: (fieldsplit_0_) 1 MPI process
  type: seqaij
  rows=615, cols=615
  total: nonzeros=9213, allocated nonzeros=9213
  total number of mallocs used during MatSetValues calls=0
not using I-node routines
  Split number 1 Defined by IS
  KSP Object: (fieldsplit_1_) 1 MPI process
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_1_) 1 MPI process
type: ilu
PC has not been set up so information may be incomplete
  out-of-place factorization
  0 levels of fill
  tolerance for zero pivot 2.22045e-14
  matrix ordering: natural
  matrix solver type: petsc
  matrix not yet factored; no additional information available
linear system matrix = precond matrix:
Mat Object: (fieldsplit_1_) 1 MPI process
  type: seqaij
  rows=64, cols=64
  total: nonzeros=0, allocated nonzeros=0
  total number of mallocs used during MatSetValues calls=0
using I-node routines: found 13 nodes, limit used is 5
  Split number 2 Defined by IS
  KSP Object: (fieldsplit_2_) 1 MPI process
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_2_) 1 MPI process
type: ilu
PC has not been set up so information may be incomplete
  out-of-place factorization
  0 levels of fill
  tolerance for zero pivot 2.22045e-14
  matrix ordering: natural
  matrix solver type: petsc
  matrix not yet factored; no additional information available
linear system matrix = precond matrix:
Mat Object: (fieldsplit_2_) 1 MPI process
  type: seqaij
  rows=240, cols=240
  total: nonzeros=2140, allocated nonzeros=2140
  total number of mallocs used during MatSetValues calls=0
not using I-node routines
  Split number 3 Defined by IS
  KSP Object: (fieldsplit_3_) 1 MPI process
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using DEFAULT norm type for convergence test
  PC Object: (fieldsplit_3_) 1 MPI process
type: ilu
PC has not been set up so information may be incomplete
  out-of-place factorization
  0 levels of fill
  tolerance for zero pivot 2.22045e-14
  matrix ordering: natural
  matrix solver type: petsc
  matrix not yet factored; no additional information available
linear system matrix = precond matrix:
Mat Object: (fieldsplit_3_) 1 MPI process
  type: seqaij
  rows=300, cols=300
  total: nonzeros=2292, allocated nonzeros=2292
  total number of mallocs used during MatSetValues calls=0
not using I-node routines
  linear system matrix = precond matrix:
  Mat Object: 1 MPI process
type: seqaij
 

Re: [petsc-users] Problem setting Fieldsplit fields

2023-02-03 Thread Nicolas Barnafi via petsc-users

Thanks Matt,

Sorry, I copied the output from the error, but in the options file I do 
it as expected:


 -pc_fieldsplit_0_fields 0,1
 -pc_fieldsplit_1_fields 2,3

Best

On 03-02-23 16:50, Matthew Knepley wrote:
On Fri, Feb 3, 2023 at 2:46 PM Nicolas Barnafi via petsc-users 
<petsc-users@mcs.anl.gov> wrote:


Dear community,

I am using a fieldsplit preconditioner, but for some reason I cannot
group fields as in other libraries (i.e. I do this in Firedrake and it
works). Some context:

I have set four fields to the preconditioner, which I want to
regroup with
-pc_fieldsplit_0_fields value: 0,1
-pc_fieldsplit_1_fields value: 2,3


You should not have a colon, ":"

   Thanks,

      Matt

But apparently this doesn't get read for some reason. In fact, from
-ksp_view I still see all 4 fields, and it fails because one of the blocks
has a null diagonal (coming from a saddle-point problem), so it gives
the following error. Interestingly, it shows that the groupings were
not considered:

[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry 0
[0]PETSC ERROR: WARNING! There are option(s) set that were not used!
Could be the program crashed before they were used or a spelling
mistake, etc!
[0]PETSC ERROR: Option left: name:-pc_fieldsplit_0_fields value: 0,1
[0]PETSC ERROR: Option left: name:-pc_fieldsplit_1_fields value: 2,3

Any clues as to why this can happen?

Best regards,
Nicolás Barnafi



--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/




[petsc-users] Problem setting Fieldsplit fields

2023-02-03 Thread Nicolas Barnafi via petsc-users

Dear community,

I am using a fieldsplit preconditioner, but for some reason I cannot 
group fields as in other libraries (i.e. I do this in Firedrake and it 
works). Some context:


I have set four fields to the preconditioner, which I want to regroup with
-pc_fieldsplit_0_fields value: 0,1
-pc_fieldsplit_1_fields value: 2,3

But apparently this doesn't get read for some reason. In fact, from 
-ksp_view I still see all 4 fields, and it fails because one of the blocks 
has a null diagonal (coming from a saddle-point problem), so it gives 
the following error. Interestingly, it shows that the groupings were 
not considered:


[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry 0
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! 
Could be the program crashed before they were used or a spelling 
mistake, etc!

[0]PETSC ERROR: Option left: name:-pc_fieldsplit_0_fields value: 0,1
[0]PETSC ERROR: Option left: name:-pc_fieldsplit_1_fields value: 2,3

Any clues as to why this can happen?

Best regards,
Nicolás Barnafi



Re: [petsc-users] Fwd: SNES with Fieldsplit + MatNest in parallel

2021-11-09 Thread Nicolas Barnafi via petsc-users



I am not adept enough at FEniCS to look at the code and debug. However,
I would make a very small problem for 2 processes, say two cells on each,
and then print out is_0 and is_1. That should tell you if you have the 
right
unknowns in each block. If so, then it seems like something is not 
correct in
your assembly routine, which is probably better debugged on the FEniCS 
list.

However, if they are not right, we can help get them right.

  Thanks,

    Matt
--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 



Thanks Matthew, I had been testing the indices and they looked fine. In fact:

Rank-0, is_0: [0 1 2] [633 634 635]
Rank-0, is_1 [636 637 638] [729 730 731]
Rank-1, is_0: [732 733 734] [1368 1369 1370]
Rank-1, is_1 [1371 1372 1373] [1464 1465 1466]

Here is_1 should be much smaller, and that is indeed the case.
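
For completeness, the full splits can also be dumped with ISView; a minimal 
sketch, assuming is_0 and is_1 are the ISes handed to the fieldsplit:

PetscCall(PetscObjectSetName((PetscObject)is_0, "is_0"));
PetscCall(PetscObjectSetName((PetscObject)is_1, "is_1"));
PetscCall(ISView(is_0, PETSC_VIEWER_STDOUT_WORLD));  /* synchronized print across ranks */
PetscCall(ISView(is_1, PETSC_VIEWER_STDOUT_WORLD));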

Thanks!
NB