On Thu, 29 Apr 2010 14:18:23 +0200, Lorenzo Botti <[email protected]> 
wrote:
> > Yeah, specifying the fields with options like -pc_fieldsplit_0_field
> > 0,1,2 requires a collocated discretization where the fields at each node
> > are interlaced (this gives better performance anyway).  It won't work
> > for mixed discretizations, or where the fields are not interlaced.  In
> > that case, you should set them with PCFieldSplitSetIS().
> >
> >
> I'm not sure I understand this point.
> What do you mean with interlaced? Isn't the off-diagonal block realizing the
> inter-field coupling?

Interlaced is sometimes called nodal blocking.  The idea is that you
have [u0,v0,p0,u1,...] instead of [u0,u1,...,v0,v1,...,p0,p1,...].  The
former is only possible for non-mixed discretizations.

> I'm looking at the source code and PCSetUp_FieldSplit() is creating the IS
> according to the field splits.
> What kind of situation can I manage with PCFieldSplitSetIS() ?

Hopefully you are looking at PETSc-3.1 (or -dev).  The implementation
handles both the case where the split is defined in terms of nodal
blocks and the case where the index sets are provided explicitly with
PCFieldSplitSetIS(), which lets you put any set of variables into any
number of splits.  Just create an IS for each field if your variables
have a different numbering or if you want to order or group them
differently.

The special treatment for nodal blocking is only provided for the cases
where it is natural to the problem, in which case it gives you a
convenient way to play around at runtime.  Otherwise, specifying the
fields requires a problem-sized amount of information and thus cannot
be done from the command line.
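When the interlaced layout does apply, the runtime experimentation
might look like this (a hypothetical invocation; option names as in
PETSc 3.1, so check them against your version):

```shell
# Interlaced 3-field layout (u,v,p per node): split velocities (fields
# 0,1) from pressure (field 2) and pick sub-preconditioners at runtime.
./myapp -pc_type fieldsplit \
        -pc_fieldsplit_block_size 3 \
        -pc_fieldsplit_0_fields 0,1 \
        -pc_fieldsplit_1_fields 2 \
        -fieldsplit_0_pc_type ilu \
        -fieldsplit_1_pc_type jacobi
```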

Jed

------------------------------------------------------------------------------
_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users