On Thu, 22 Apr 2010 10:21:26 -0400, Boyce Griffith <[email protected]> 
wrote:
> 
> 
> On 4/22/10 9:56 AM, Jed Brown wrote:
> > On Thu, 22 Apr 2010 09:32:12 -0400, Boyce Griffith<[email protected]>  
> > wrote:
> >> How should one set up the index set?  Is it just a mapping from Vec index
> >> to field index?  E.g., each Vec entry is assigned to field 0 or 1 (or 2
> >> or ...)?
> >
> > If your variables are interlaced like
> >
> >    u0,v0,w0,u1,v1,w1,...
> 
> In the present context, the order of variables in memory is whatever is 
> determined by libMesh --- and I haven't yet looked to see what that is. 
>   It's my impression that variables are not interlaced like this, but I 
> could be mistaken.

Right, but Roy said this:

    Block variable ordering with "--node_major_dofs" is perhaps the least
    well-documented option in libMesh, which is saying a lot, but we're
    already using it.

> > Then you can (a) set a block size for the matrix with MatSetBlockSize,
> > (b) use BAIJ in which case the block size is built in, or (c) use
> > -pc_fieldsplit_block_size.  Then to define the splits, use
> >
> >    -pc_fieldsplit_0_fields 0,3 -pc_fieldsplit_1_fields 1 -pc_fieldsplit_2_fields 2,4,5
> >
> > to split six fields into three splits of sizes 2,1,3.
> 
> In this case, is field N the set of global Vec indices for which idx%6 == N?

Yes

> >    split0: 0 3 6 9       | 12 15 18 21
> >    split1: 1 7           | 13 19
> >    split2: 2 4 5 8 10 11 | 14 16 17 20 22 23
> >
> > I hope this is clear.
> 
> Sorry for being dense, but in this case, if I were to create the IS 
> manually using ISCreateGeneral, on process 0, I think that I would use:
> 
>     idx[0] = 0
>     idx[1] = 1
>     idx[2] = 2
>     idx[3] = 0
>     ...
> 
> and similarly on process 1:
> 
>     ...
>     idx[4] = idx[16-12] = 2
>     idx[5] = idx[17-12] = 2
>     idx[6] = idx[18-12] = 0
>     idx[7] = idx[19-12] = 1
>     ...

To create the index set for split0:

  /* This assumes the communicator has size 2 */
  PetscInt idx[2][4] = {{0,3,6,9},{12,15,18,21}};
  int      rank;
  IS       is0;

  MPI_Comm_rank(comm,&rank);
  ISCreateGeneral(comm,4,idx[rank],&is0);

This is a parallel index set containing the global indices of the dofs
in the desired split.

Jed

_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users
