Dear Matt and Barry,

I have some follow-up questions regarding FieldSplit.
Let's assume that I am again solving the 3D Stokes flow, but now I also have a 
global constraint that controls the flow rate at the inlet. The matrix has the 
same unknowns as before, i.e. ux0,uy0,uz0,p0 // ux1,uy1,uz1,p1 // ..., but the 
last row (and the last column) corresponds to the contribution of the global 
constraint equation. I want to incorporate this last row (and column) into the 
local block of velocities (split 0) and the pressure. The problem is how to do 
that. I have two questions:

  1.  Should the block size now be 5 when I create the matrix and vector for 
this problem?
  2.  Do I have to rely entirely on PCFieldSplitSetIS to create the two blocks, 
or can I simply augment the previously defined block 0 with the last row of the 
matrix? (A rough sketch of what I mean follows my current commands below.)

Up to now, I use the following commands to create the field split:
PetscInt :: ufields(3), pfields(1)

ufields = [0, 1, 2]
pfields = [3]

call PCSetType(pc, PCFIELDSPLIT, ierr)
call PCFieldSplitSetBlockSize(pc, 4, ierr)
call PCFieldSplitSetFields(pc, "0", 3, ufields, ufields, ierr)
call PCFieldSplitSetFields(pc, "1", 1, pfields, pfields, ierr)
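
For concreteness, here is a rough, untested sketch of what I imagine the 
PCFieldSplitSetIS route would look like. The names nloc (locally owned nodes), 
istart (global index of this rank's first dof) and nglobal (total number of dofs 
including the constraint row) are just placeholders of mine; rank/nranks come 
from MPI_Comm_rank/MPI_Comm_size, and I assume the constraint row is the last 
global row, appended to split 0 on the last rank:

PetscInt, allocatable :: uidx(:), pidx(:)
PetscInt  nu, np, i
IS        isu, isp

nu = 3*nloc                                   ! ux,uy,uz per local node
if (rank == nranks-1) nu = nu + 1             ! constraint row joins split 0
np = nloc                                     ! one pressure dof per node
allocate(uidx(nu), pidx(np))

do i = 0, nloc-1
  uidx(3*i+1) = istart + 4*i                  ! ux
  uidx(3*i+2) = istart + 4*i + 1              ! uy
  uidx(3*i+3) = istart + 4*i + 2              ! uz
  pidx(i+1)   = istart + 4*i + 3              ! p
end do
if (rank == nranks-1) uidx(nu) = nglobal - 1  ! the global constraint dof

call ISCreateGeneral(PETSC_COMM_WORLD, nu, uidx, PETSC_COPY_VALUES, isu, ierr)
call ISCreateGeneral(PETSC_COMM_WORLD, np, pidx, PETSC_COPY_VALUES, isp, ierr)

call PCSetType(pc, PCFIELDSPLIT, ierr)
call PCFieldSplitSetIS(pc, "0", isu, ierr)
call PCFieldSplitSetIS(pc, "1", isp, ierr)

If this is the right direction, I guess the PCFieldSplitSetBlockSize call would 
simply be dropped.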

Thanks,
Pantelis


________________________________
From: Matthew Knepley <knep...@gmail.com>
Sent: Friday, January 19, 2024 11:31 PM
To: Barry Smith <bsm...@petsc.dev>
Cc: Pantelis Moschopoulos <pmoschopou...@outlook.com>; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Question about a parallel implementation of PCFIELDSPLIT

On Fri, Jan 19, 2024 at 4:25 PM Barry Smith <bsm...@petsc.dev> wrote:

   Generally fieldsplit is used on problems that have a natural "split" of the 
variables into two or more subsets. For example u0,v0,u1,v1,u2,v2,u3,v3. This is 
often indicated in the vectors and matrices with the "blocksize" argument, 2 in 
this case. DM also often provides this information.

   When laying out a vector/matrix with a blocksize, one must ensure that an 
equal number of the subsets appears on each MPI process. So, for example, if 
the above vector is distributed over 3 MPI processes one could use 
u0,v0,u1,v1 | u2,v2 | u3,v3, but one cannot use u0,v0,u1 | v1,u2,v2 | u3,v3. 
Another way to think about it is that one must split up the vector, as indexed 
by block, among the processes. For most multicomponent problems this 
type of decomposition is very natural in the logic of the code.
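
To make this concrete, a minimal, untested sketch (ndof here stands for the 
number of dofs per node, 2 in the example above, and nnodes for the global node 
count) of one way to lay out the Vec/Mat so that every rank owns whole blocks:

nnodes_loc = PETSC_DECIDE
call PetscSplitOwnership(PETSC_COMM_WORLD, nnodes_loc, nnodes, ierr)
nloc = ndof*nnodes_loc                        ! local size is a multiple of the blocksize

call VecCreate(PETSC_COMM_WORLD, x, ierr)
call VecSetSizes(x, nloc, PETSC_DETERMINE, ierr)
call VecSetBlockSize(x, ndof, ierr)
call VecSetFromOptions(x, ierr)

call MatCreate(PETSC_COMM_WORLD, A, ierr)
call MatSetSizes(A, nloc, nloc, PETSC_DETERMINE, PETSC_DETERMINE, ierr)
call MatSetBlockSize(A, ndof, ierr)
call MatSetFromOptions(A, ierr)

With a layout like this, an error such as "Local columns of A10 ... do not equal 
local rows of A00 ..." should not appear, because the splits never cut through a 
block.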

This blocking is only convenient, not necessary. You can specify your own field 
division using PCFieldSplitSetIS().

  Thanks,

     Matt

  Barry


On Jan 19, 2024, at 3:19 AM, Pantelis Moschopoulos 
<pmoschopou...@outlook.com> wrote:

Dear all,

When I use PCFIELDSPLIT with the "schur" pc type in serial mode, everything 
works fine. When I now switch to parallel, I observe that the number of ranks I 
can use must divide N without any remainder, where N is the number of unknowns. 
Otherwise, an error of the following form emerges: 
"Local columns of A10 3473 do not equal local rows of A00 3471".

Can I do something to overcome this?

Thanks,
Pantelis



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
