Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-19 Thread Edoardo alinovi
Please ignore me, I was just making a mistake with the number of nonzeros. With Jed's suggestion to use MatXAIJSetPreallocation I can write a very bespoke code and everything looks good. I'll test the field splitting a bit to see if I can find some performance! Cheers
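
For readers finding this thread later: a minimal sketch of the MatXAIJSetPreallocation route Jed suggested, assuming block size 3 and an illustrative stencil (the dnnz/onnz counts below are made up, not taken from Edoardo's code):

    subroutine preallocate_blocked(A, nbrows, ierr)
    #include <petsc/finclude/petscmat.h>
    use petscmat
    implicit none
    Mat :: A
    PetscInt :: nbrows, i
    PetscErrorCode :: ierr
    PetscInt, parameter :: bs = 3
    PetscInt :: dnnz(nbrows), onnz(nbrows)

    ! With bs > 1 the counts are per BLOCK row: here an assumed diagonal
    ! block plus four on-process neighbour blocks, two off-process.
    do i = 1, nbrows
       dnnz(i) = 5
       onnz(i) = 2
    end do
    ! The last two (upper-triangular) count arrays are only consulted for
    ! SBAIJ; reusing dnnz/onnz is harmless for AIJ and BAIJ.
    call MatXAIJSetPreallocation(A, bs, dnnz, onnz, dnnz, onnz, ierr)
    end subroutine preallocate_blocked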

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-19 Thread Edoardo alinovi
Morning Guys, The good news is that finally fieldsplit is working ok! I have just a simple question about the interaction of MatMPIAIJSetPreallocation with MatSetValuesBlocked. In this simple example I have a 27x27 matrix (9 blocks, each composed of a 3x3 matrix). The only entry I have on each
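
On the interaction itself: each blocked insertion consumes bs nonzeros in each of its bs scalar rows, and MatMPIAIJSetPreallocation() counts scalar nonzeros per scalar row. A minimal sketch of one 3x3 block insertion (block indices are 0-based; the values are placeholders):

    PetscInt :: brow(1), bcol(1)
    PetscScalar :: vals(9)

    brow(1) = 4    ! block row 4 of 0..8 in the 27x27 matrix
    bcol(1) = 4
    vals = 1.0     ! 3x3 of placeholder entries
    ! This one call touches scalar rows 12..14 and columns 12..14, i.e.
    ! 3 nonzeros in each of 3 scalar rows of the preallocated pattern.
    call MatSetValuesBlocked(A, 1, brow, 1, bcol, vals, INSERT_VALUES, ierr)
    call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
    call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)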

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-15 Thread Edoardo alinovi
Thanks, I'll do it then :) On Tue 15 Nov 2022, 19:25 Jed Brown wrote: > You do if preconditioners (like AMG) will use it or if using functions > like MatSetValuesBlocked(). If you have uniform block structure, it doesn't > hurt. > > Edoardo alinovi writes: > > > Hi Guys, > > > > Very

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-15 Thread Jed Brown
You do if preconditioners (like AMG) will use it or if using functions like MatSetValuesBlocked(). If you have uniform block structure, it doesn't hurt. Edoardo alinovi writes: > Hi Guys, > > Very quick one. Do I need to set the block size with MPIAIJ?
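
Concretely, the call in question is a one-liner (sketch; the handle name is assumed):

    ! AIJ stores no block structure of its own: MatSetBlockSize() just
    ! records bs so that MatSetValuesBlocked() and block-aware
    ! preconditioners can exploit it. Set it before assembly.
    call MatSetBlockSize(A, 3, ierr)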

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-15 Thread Edoardo alinovi
Hi Guys, Very quick one. Do I need to set the block size with MPIAIJ?

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-14 Thread Edoardo alinovi
Thanks Barry, Your help is always much appreciated! I'll try this out asap. I ended up using BAIJ because I had read the section "solving block matrices" and was thinking that BAIJ was the only way to use fieldsplit! Completely misunderstood then! On Mon 14 Nov 2022, 20:34 Barry Smith wrote

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-14 Thread Barry Smith
Can you clarify what you mean? For some classes of problems, PCFIELDSPLIT can be a very efficacious preconditioner; for example when certain fields have very different mathematical structure than others. In those cases it is worth using AIJ and PCFIELDSPLIT instead of keeping BAIJ. > On

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-14 Thread Barry Smith
Very sorry for wasting so much of your time. The PCFIELDSPLIT generally will not work with BAIJ matrices because the MatCreateSubMatrix() for BAIJ requires indexing by block in the matrix. Your code should work if you use MPIAIJ matrices. Note you can still use MatSetValuesBlocked() with
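
A sketch of the switch Barry describes, where only the type line changes and the blocked assembly stays (variable names assumed):

    call MatCreate(PETSC_COMM_WORLD, A, ierr)
    call MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n, ierr)
    call MatSetType(A, MATMPIAIJ, ierr)   ! was MATMPIBAIJ
    call MatSetBlockSize(A, 3, ierr)      ! MatSetValuesBlocked() still works
    ! ... preallocation and the existing MatSetValuesBlocked() loops unchanged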

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-11 Thread Edoardo alinovi
Hi guys, anyone with any suggestions to make this thing work? On Fri 11 Nov 2022, 10:24 Edoardo alinovi wrote: > Hi Barry, > > FYI, in test.F90 I noted that "ui" starts from 1 and not from 0. I fixed it > but the situation does not change much. I attached the new file in this > email. >

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Edoardo alinovi
Ah, I see you have already added the missing interfaces for Fortran enthusiasts :) So you likely do not need Matt's hack!

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Edoardo alinovi
True, maybe somebody merged it already? I have attached my silly example. To compile: mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include Do you need the PETSc code Matt did? (attachment: test.F90)

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Barry Smith
Can you share the code that produces the problem below? > On Nov 10, 2022, at 3:52 PM, Edoardo alinovi wrote: > > The thing is, even if I pass the following options: > > -UPeqn_pc_type fieldsplit -UPeqn_pc_fieldsplit_0_fields 0,1 > -UPeqn_pc_fieldsplit_1_fields 2

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Barry Smith
Hmm, that branch does not appear to exist. > On Nov 10, 2022, at 3:48 PM, Edoardo alinovi wrote: > > I am sorry Barry, > > I told you it works, but it does not. I changed the index to integer, but I am > still getting this: > > [0]PETSC ERROR: - Error Message >

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Edoardo alinovi
The thing is, even if I pass the following options: -UPeqn_pc_type fieldsplit -UPeqn_pc_fieldsplit_0_fields 0,1 -UPeqn_pc_fieldsplit_1_fields 2 -UPeqn_pc_fieldsplit_type SCHUR -UPeqn_pc_fieldsplit_block_size 3 I am getting the same issue, so there must be something fundamental in the way I am
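
For the record, a sketch of the programmatic equivalent of those options (the PC handle and split names are assumed; the Fortran binding for PCFieldSplitSetFields() is the one discussed further down this thread, and the integer literals should be PetscInt variables in a 64-bit-index build):

    PetscInt :: ufields(2), pfields(1)

    ufields(1) = 0   ! u
    ufields(2) = 1   ! v
    pfields(1) = 2   ! p
    call PCSetType(mypc, PCFIELDSPLIT, ierr)
    call PCFieldSplitSetBlockSize(mypc, 3, ierr)
    call PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)
    call PCFieldSplitSetFields(mypc, 'u', 2, ufields, ufields, ierr)
    call PCFieldSplitSetFields(mypc, 'p', 1, pfields, pfields, ierr)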

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Edoardo alinovi
I am sorry Barry, I told you it works, but it does not. I changed the index to integer, but I am still getting this: [0]PETSC ERROR: - Error Message -- [0]PETSC ERROR: Nonconforming object sizes [0]PETSC ERROR: Local

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Edoardo alinovi
Hi Barry, Thanks a lot for getting back to me, I am quite stuck at the moment! Matt kindly added them in a dev branch I am using right now to test this pc. You are right I am declaring them badly, I am an idiot! My small test works now, but I'm still in trouble with the main code unfortunately.

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Barry Smith
These beasts should be PetscInt, not real: real :: ufields(2), pfields(1) Side note: we do not recommend using options like -fdefault-real-8 because the compiler may change values in surprising ways. You can use PetscReal to declare real numbers and this will automatically match with
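
Spelled out, the corrected declarations look like this (sketch):

    ! Field index arrays must be PetscInt; a default real here corrupts
    ! the argument list at the Fortran/C boundary.
    PetscInt :: ufields(2), pfields(1)
    ! For floating-point variables, prefer PetscReal/PetscScalar over
    ! relying on -fdefault-real-8 to match PETSc's build precision.
    PetscReal :: residual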

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-10 Thread Edoardo alinovi
Hello, I have tried a different way to create the splitting: ui(1) = 0 ui(2) = 1 pi(1) = 2 call ISCreateGeneral(PETSC_COMM_WORLD, 2, ui, PETSC_COPY_VALUES, isu, ierr) call ISCreateGeneral(PETSC_COMM_WORLD, 1, pi, PETSC_COPY_VALUES, isp, ierr) call
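
The truncated tail is presumably the usual attachment of the index sets, i.e. something like (a sketch of the standard pattern, not Edoardo's exact code):

    call PCFieldSplitSetIS(mypc, 'u', isu, ierr)
    call PCFieldSplitSetIS(mypc, 'p', isp, ierr)

Note that as written the ISs index only the dofs of a single cell; the coverage requirement comes up again further down the thread.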

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
Hi Matt, it took a bit more than 1s, but I can reproduce the error in the attached file. To compile: mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include Please run it in serial as I have hardcoded some dimensions

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
The fact that it is telling me 6 instead of 9 makes me think it is getting just the first split for "u" and not the second one for "p" that would lead to 9.

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Matthew Knepley
That would be fine. Thanks, Matt On Wed, Nov 9, 2022 at 9:19 AM Edoardo alinovi wrote: > I am copying this example: > https://petsc.org/release/src/ksp/ksp/tutorials/ex42.c.html > lines 2040-2042

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Matthew Knepley
On Wed, Nov 9, 2022 at 9:17 AM Edoardo alinovi wrote: > Thanks, > > the stuff I am doing is within my code, so I am not sure you can reproduce > it. > How about just making a small code that fills those nonzeros with 1s? We just want to figure out why your sparsity pattern is not working. We

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
I am copying this example: https://petsc.org/release/src/ksp/ksp/tutorials/ex42.c.html lines 2040-2042

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
Thanks, the stuff I am doing is within my code, so I am not sure you can reproduce it. I am just doing this: call PCSetType(mypc, PCFIELDSPLIT, ierr) call PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) ufields(1) = 0 ufields(2) = 1
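
One detail worth pinning down here: the field numbers (the array VALUES) are 0-based even though the Fortran array subscripts start at 1 (sketch):

    PetscInt :: ufields(2), pfields(1)

    ufields(1) = 0   ! first field  (u)
    ufields(2) = 1   ! second field (v)
    pfields(1) = 2   ! third field  (p)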

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Matthew Knepley
On Wed, Nov 9, 2022 at 9:05 AM Edoardo alinovi wrote: > So my cavity has 3x3=9 cells, each cell has a 3x3 block. I get the same > error: [0]PETSC ERROR: Local column sizes 6 do not add up to total number > of columns 9 > > However I do not define any IS, I just pass an array > to

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
So my cavity has 3x3=9 cells, each cell has a 3x3 block. I get the same error: [0]PETSC ERROR: Local column sizes 6 do not add up to total number of columns 9 However I do not define any IS, I just pass an array to PCFieldSplitSetFields() and thus I do not know how to print them...

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Matthew Knepley
On Wed, Nov 9, 2022 at 8:09 AM Edoardo alinovi wrote: > Sure, > > I'll try on a 3x3 cavity. How can I print the ISs? > ISView() or PetscObjectViewFromOptions() Thanks, Matt > On Wed 9 Nov 2022, 14:07 Matthew Knepley wrote: > >> On Wed, Nov 9, 2022 at 7:57 AM Edoardo alinovi >>
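
Usage of the first suggestion, for completeness (sketch; the IS names are assumed):

    ! Print each split's index set to stdout (also fine in parallel):
    call ISView(isu, PETSC_VIEWER_STDOUT_WORLD, ierr)
    call ISView(isp, PETSC_VIEWER_STDOUT_WORLD, ierr)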

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
Sure, I'll try on a 3x3 cavity. How can I print the ISs? On Wed 9 Nov 2022, 14:07 Matthew Knepley wrote: > On Wed, Nov 9, 2022 at 7:57 AM Edoardo alinovi > wrote: > >> To be clear, >> >> You are suggesting to use ufields(0)=0, ufields(1)=1 and so on? >> > > I think you are right. Those

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Matthew Knepley
On Wed, Nov 9, 2022 at 7:57 AM Edoardo alinovi wrote: > To be clear, > > You are suggesting to use ufields(0)=0, ufields(1)=1 and so on? > I think you are right. Those should start from 1. However, your ISes do not seem to cover the whole matrix. Can you start with a very small problem so that
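
To make "cover the whole matrix" concrete: for a 9-cell cavity with interleaved (u, v, p) unknowns, the two index sets together must list all 27 dofs, not just the three of one cell. A sketch, assuming a serial run and 0-based interleaved ordering:

    IS :: isu, isp
    PetscInt :: i, ui(18), pi(9)

    do i = 0, 8                ! one (u, v, p) triple per cell
       ui(2*i+1) = 3*i         ! u dof of cell i
       ui(2*i+2) = 3*i + 1     ! v dof of cell i
       pi(i+1)   = 3*i + 2     ! p dof of cell i
    end do
    call ISCreateGeneral(PETSC_COMM_WORLD, 18, ui, PETSC_COPY_VALUES, isu, ierr)
    call ISCreateGeneral(PETSC_COMM_WORLD, 9, pi, PETSC_COPY_VALUES, isp, ierr)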

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
To be clear, you are suggesting to use ufields(0)=0, ufields(1)=1 and so on? On Wed 9 Nov 2022, 13:54 Edoardo alinovi wrote: > Even in the Fortran interface? > > On Wed 9 Nov 2022, 13:52 Matthew Knepley wrote: > >> Fields are numbered from 0. >> >> Thanks, >> >> Matt >> >> On

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Edoardo alinovi
Even in the Fortran interface? On Wed 9 Nov 2022, 13:52 Matthew Knepley wrote: > Fields are numbered from 0. > > Thanks, > > Matt > > On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi wrote: > >> Hello guys, >> >> I am getting this error while using fieldsplit: >> >> [3]PETSC ERROR:

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-09 Thread Matthew Knepley
Fields are numbered from 0. Thanks, Matt On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi wrote: > Hello guys, > > I am getting this error while using fieldsplit: > > [3]PETSC ERROR: - Error Message > -- > >

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-08 Thread Edoardo alinovi
Hello guys, I am getting this error while using fieldsplit: [3]PETSC ERROR: - Error Message -- [3]PETSC ERROR: Nonconforming object sizes [3]PETSC ERROR: Local column sizes 6132 do not add up to total number of

Re: [petsc-users] On PCFIELDSPLIT and its implementation

2022-11-08 Thread Matthew Knepley
On Tue, Nov 8, 2022 at 12:05 PM Edoardo alinovi wrote: > Hello Guys, > > Thanks to your suggestions on the block matrices, my fully coupled solver > is proceeding very well! > > I am now about to take advantage of the block structure of the matrix > using PCFIELDSPLIT. I have learned a bit from

[petsc-users] On PCFIELDSPLIT and its implementation

2022-11-08 Thread Edoardo alinovi
Hello Guys, Thanks to your suggestions on the block matrices, my fully coupled solver is proceeding very well! I am now about to take advantage of the block structure of the matrix using PCFIELDSPLIT. I have learned a bit from the user manual and followed with interest this discussion in the