Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Eric Chamberland via petsc-users

Hi everyone,

I'm joining this discussion because we're testing a simple approach to 
streamlining the current object naming and configuration system in PETSc.


The solution we're testing follows the same philosophy as the options 
database keys: a "default" database.


We're creating a "default" PetscObjectList that gives access to a global 
list of <name, PetscObject> pairs.


With this, we can simply "name" and add all our PetscObjects to it.

At run time, we create an option that holds the name of the PetscObject 
to be passed as a parameter to whatever we want to configure.


For example, let's say you want to pass an optional matrix named 
"preSchur" to a specific PC. It could be configured like this:


Mat MyMat;
MatCreate(PETSC_COMM_WORLD, &MyMat);
PetscObjectSetName((PetscObject)MyMat, "preSchur");
/* MyMat is now available in the global object list under the key "preSchur" */
PetscObjectListAdd(&gGlobalList, "preSchur", (PetscObject)MyMat);


Then, in a PC, you can retrieve the matrix by using both the options 
database keys and the objects database. For example, if a PC looks for 
options like:


-pc_fieldsplit_schur_precondition user
-pc_fieldsplit_schur_precondition_user_mat preSchur

It just needs to check whether the key 
"pc_fieldsplit_schur_precondition_user_mat" has been set and, if so, 
read its value: the name under which the matrix is stored in the 
global list.
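In plain Python, the lookup mechanism could be sketched like this (a minimal model of the idea, not the PETSc API; `register`, `get_configured_object`, and the two dictionaries are illustrative stand-ins for PetscObjectListAdd(), the options database, and the global object list):

```python
# Minimal model of the proposed mechanism: an options database maps option
# keys to strings, and a global registry maps names to objects.
# All names below are illustrative, not actual PETSc identifiers.

registry = {}   # stands in for the global PetscObjectList
options = {}    # stands in for the PETSc options database

def register(name, obj):
    """Name an object and publish it in the global registry."""
    registry[name] = obj

def get_configured_object(option_key):
    """Inside a PC: if the option is set, fetch the named object."""
    name = options.get(option_key)
    if name is None:
        return None           # option not given: nothing to configure
    return registry[name]     # look the object up by its name

# The user names a matrix and sets the option:
register("preSchur", {"type": "Mat", "label": "Schur preconditioning matrix"})
options["pc_fieldsplit_schur_precondition_user_mat"] = "preSchur"

mat = get_configured_object("pc_fieldsplit_schur_precondition_user_mat")
```

The PC code never needs to know where the matrix came from; it only agrees with the user on an option key.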


With this approach, all PCs, KSPs, and other objects can be given 
objects at setup time (or later) with proper naming of the options and 
objects.


Some key features of this approach include:

- If a user misspells an option, it will be visible in the unused options.
- New options for any PC or KSP do not require a new API function to use 
them; only a new option key is needed.
- All software that uses PETSc as a library can have a very "light" 
interface to it: just name all objects and let users "configure" them!
- This is almost the same idea as "composing objects," which is already 
used in many places in PETSc. It's simpler to use and really relaxes the 
constraints on "when" you can configure a PC.
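The first bullet (a misspelled option surfaces in the unused options) could be modeled by tracking which keys are ever queried, in the same spirit as -options_left; the following is a sketch with illustrative names, not PETSc code:

```python
# Sketch of the "unused options" safety net: keys that were set but never
# queried (e.g. misspelled ones) are reported at the end of the run.
options = {
    "pc_fieldsplit_schur_precondition_user_mat": "preSchur",
    "pc_fieldsplit_schur_precondtion_user_mat": "oops",  # note the typo
}
queried = set()

def get_option(key):
    queried.add(key)          # remember every key the library asked for
    return options.get(key)

get_option("pc_fieldsplit_schur_precondition_user_mat")  # the PC reads this one

# Anything set but never read is flagged, exposing the misspelled key:
unused = sorted(k for k in options if k not in queried)
```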


I think this is a promising approach to simplifying the object naming 
and configuration system in PETSc. I'm interested in hearing your feedback.


Thanks,

Eric



Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Stefano Zampini
clearly, I meant optimized mode


Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Stefano Zampini
In any case, we need to activate PetscCheck in debug mode too. This could
have been avoided.


Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Karin
Thank you very much for the information Matt.
Unfortunately, I do not use DM :-(


Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Matthew Knepley

Actually, I hated this as well, so I built a way around it _if_ you are
using a DM to define the problem. Then
you can set a "nullspace constructor" to make it if the field you are
talking about is ever extracted. You use DMSetNullSpaceConstructor(). I do
this in SNES ex62 and ex69, and other examples.

  Thanks,

 Matt
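Matt's DM-based workaround can be modeled schematically: register a constructor per field, and have the field-extraction step build and attach the near null space on demand. This is a plain-Python sketch of the pattern only, not the petsc4py or DMSetNullSpaceConstructor() API:

```python
# Schematic model of the "nullspace constructor" pattern: instead of
# attaching a near null space after PCSetUp(), register a callback that
# is invoked whenever the field's sub-matrix is extracted.
constructors = {}   # field name -> callback building the near null space

def set_nullspace_constructor(field, make_nullspace):
    constructors[field] = make_nullspace

def extract_field(field, submatrix):
    """Stand-in for the fieldsplit extraction step."""
    make = constructors.get(field)
    if make is not None:
        # Built lazily, exactly when the block actually exists:
        submatrix["near_nullspace"] = make()
    return submatrix

set_nullspace_constructor("phys", lambda: ["rigid body modes"])
block = extract_field("phys", {"name": "A00"})
```

The point of the design is timing: the callback fires only when the block is created, so the user never has to guess "when" in the setup sequence to attach the null space.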



Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Pierre Jolivet
Dear Nicolas,


Yep, it’s the same with all “nested” solvers, fieldsplit, ASM, MG, you name it.
You first need the initial PCSetUp() so that the bare minimum is put in place, 
then you have to fetch things yourself and adapt it to your needs.
We had a similar discussion with the MEF++ people last week, there is currently 
no way around this, AFAIK.

Thanks,
Pierre


Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread TARDIEU Nicolas via petsc-users
Dear Pierre,

You are absolutely right. I was using a --with-debugging=0 (aka release) 
install and this is definitely an error.
Once I used my debug install, I found the way to fix my problem. The solution 
is in the attached script: I first need to extract the correct block from the 
PC operator's MatNest and then append the null space to it.
Anyway this is a bit tricky...

Regards,
Nicolas



Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-12 Thread Pierre Jolivet


> On 12 Jul 2023, at 6:04 PM, TARDIEU Nicolas via petsc-users 
>  wrote:
> 
> Dear PETSc team,
> 
> In the attached example, I set up a block pc for a saddle-point problem in 
> petsc4py. The IS define the unknowns, namely some physical quantity (phys) 
> and a Lagrange multiplier (lags).
> I would like to attach a near null space to the physical block, in order to 
> get the best performance from an AMG pc. 
> I have been trying hard, attaching it to the initial block, to the IS but no 
> matter what I am doing, when it comes to "ksp_view", no near null space is 
> attached to the matrix.
> 
> Could you please help me figure out what I am doing wrong?

Are you using a double-precision 32-bit integers real build of PETSc?
Is it --with-debugging=0?
Because with my debug build, I get the following error (thus explaining why 
it’s not attached to the KSP).
Traceback (most recent call last):
  File "/Volumes/Data/Downloads/test/test_NullSpace.py", line 35, in 
ns = NullSpace().create(True, [v], comm=comm)
 
  File "petsc4py/PETSc/Mat.pyx", line 5611, in petsc4py.PETSc.NullSpace.create
petsc4py.PETSc.Error: error code 62
[0] MatNullSpaceCreate() at 
/Volumes/Data/repositories/petsc/src/mat/interface/matnull.c:249
[0] Invalid argument
[0] Vector 0 must have 2-norm of 1.0, it is 22.3159

Furthermore, if you set the constant vector yourself in the near null space, 
then the first argument of create() must be False; otherwise you will have the 
same vector twice, and you will end up with another error (the vectors in the 
near null space must be orthonormal).
If things still don’t work after those couple of fixes, please feel free to 
send an up-to-date reproducer.

Thanks,
Pierre

> Thanks,
> Nicolas
> 
> 
> This message and any attachments (the 'Message') are intended solely for the 
> addressees. The information contained in this Message is confidential. Any 
> use of information contained in this Message not in accord with its purpose, 
> any dissemination or disclosure, either whole or partial, is prohibited 
> except formal approval.
> 
> If you are not the addressee, you may not copy, forward, disclose or use any 
> part of it. If you have received this message in error, please delete it and 
> all copies from your system and notify the sender immediately by return 
> message.
> 
> E-mail communication cannot be guaranteed to be timely secure, error or 
> virus-free.
> 



[petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-12 Thread TARDIEU Nicolas via petsc-users
Dear PETSc team,

In the attached example, I set up a block PC for a saddle-point problem in 
petsc4py. The ISes define the unknowns, namely a physical quantity (phys) and 
a Lagrange multiplier (lags).
I would like to attach a near null space to the physical block, in order to get 
the best performance from an AMG PC.
I have tried hard, attaching it to the initial block and to the IS, but no 
matter what I do, when it comes to "ksp_view", no near null space is 
attached to the matrix.

Could you please help me figure out what I am doing wrong?

Thanks,
Nicolas






Attachment: test.tgz