On May 14, 2011, at 3:31 PM, Dmitry Karpeev wrote:
> On Sat, May 14, 2011 at 3:28 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>> On May 14, 2011, at 3:09 PM, Jed Brown wrote:
>>
>>> On Sat, May 14, 2011 at 21:47, Dmitry Karpeev <karpeev at mcs.anl.gov>
>>> wrote:
>>> Can these split ISs have communicators different from that of
>>> PetscLayout (this is something I use in GASM,
>>> for example)?
>>>
>>> I guess I always thought of the field information in PetscLayout as having
>>> semantic meaning to the user. I also thought these ISs would be strictly
>>> non-overlapping and would address only locally owned values.
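A minimal sketch of that reading of "fields" (hypothetical helper; it assumes a
layout with local range [rstart, rend) and splits the locally owned entries
into two disjoint fields by parity):

  #include <petscis.h>

  PetscErrorCode CreateDisjointFieldISs(MPI_Comm comm,PetscInt rstart,PetscInt rend,IS *f0,IS *f1)
  {
    PetscErrorCode ierr;
    PetscInt       n = rend - rstart;   /* number of locally owned entries */

    PetscFunctionBegin;
    /* field 0: rstart, rstart+2, ...; field 1: rstart+1, rstart+3, ... */
    ierr = ISCreateStride(comm,(n+1)/2,rstart,2,f0);CHKERRQ(ierr);
    ierr = ISCreateStride(comm,n/2,rstart+1,2,f1);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }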
>>
>> Dmitry asked: What will be the meaning of overlapping splits?
>>
>> Matt wrote: boundary conditions (if you want).
>>
>> Jed wrote later: I think that has a different semantic meaning and thus
>> deserves a different mechanism. I don't know whether it belongs in
>> PetscLayout, but I think it is a different beast from defining "fields" that
>> have semantic meaning to the user.
>
> Via Vec/MatSetValuesLocal? I would agree with that.
> What I don't like about that is that now there will be two different
> ways of treating splits that happen to be nonoverlapping:
> via some clever XXXSetValuesLocal and via the nonoverlapping splits
> mechanism inside PetscLayout.
This is a good observation. Rather than putting the local-to-global mappings
onto Vecs and Mats, we could put them onto the PetscLayout inside the Vec/Mat;
this would simplify the current code a bit and get things like MatGetVecs() to
properly pass local-to-global information to the created vectors. So, for
example, we could have something like
typedef struct _n_PetscLayout* PetscLayout;
struct _n_PetscLayout {
  MPI_Comm  comm;
  PetscInt  n,N;          /* local, global vector size */
  PetscInt  rstart,rend;  /* local start, local end + 1 */
  PetscInt  *range;       /* the offset of each processor */
  PetscInt  bs;           /* number of elements in each block (generally for
                             multi-component problems); do NOT multiply the
                             numbers above by bs */
  PetscInt  refcnt;       /* MPI Vecs obtained with VecDuplicate() and from
                             MatGetVecs() reuse the map of the input object */
  PetscInt  nfields;
  IS        *fields;
  IS        localtoglobalmapping,blocklocaltoglobalmapping;
  /* ... user info such as boundary condition locations ... */
};
We could put the local-to-global mappings into the PetscLayout independently of
the other changes we've been discussing.
Barry
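A hypothetical sketch of what that would enable: MatGetVecs() handing the
layout's mapping to the vector it creates. MatGetVecs() and
VecSetLocalToGlobalMapping() are existing calls; the localtoglobalmapping
field on the layout is only the proposal above, assumed here to be an
ISLocalToGlobalMapping so the call type-checks, and A->cmap requires the
private Mat header.

  #include <private/matimpl.h>   /* direct access to A->cmap, for this sketch only */

  PetscErrorCode MatGetVecsWithMapping(Mat A,Vec *right)
  {
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MatGetVecs(A,right,PETSC_NULL);CHKERRQ(ierr);
    /* hand the column layout's (proposed) mapping to the new vector */
    if (A->cmap->localtoglobalmapping) {
      ierr = VecSetLocalToGlobalMapping(*right,A->cmap->localtoglobalmapping);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }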
>
> Dmitry.
>>
>> Barry responds: we could require non-overlapping splits, but I am not sure
>> that is necessary or desirable. For example, the first IS might denote
>> velocity variables, the second IS might denote pressure variables, and the
>> third IS might denote locations with fixed boundary conditions. All of these
>> things have "semantic meaning" to the user. Only some of them may have
>> meaning to PCFieldSplit. If we could use the same mechanism for all of
>> them, that might be better than having different mechanisms for "marking"
>> different "meanings".
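A hypothetical sketch of that usage. PetscLayoutSetFields() is an invented
name for the proposed mechanism, not existing API, and the index arrays are
assumed given; ISCreateGeneral() is the existing call.

  /* Register three ISs with semantic meaning on a layout; the fixed-BC IS
     may overlap the velocity and pressure ISs. */
  PetscErrorCode RegisterFieldISs(MPI_Comm comm,PetscLayout map,
                                  PetscInt nv,const PetscInt *velocity_idx,
                                  PetscInt np,const PetscInt *pressure_idx,
                                  PetscInt nbc,const PetscInt *fixed_bc_idx)
  {
    PetscErrorCode ierr;
    IS             fields[3];

    PetscFunctionBegin;
    ierr = ISCreateGeneral(comm,nv,velocity_idx,PETSC_COPY_VALUES,&fields[0]);CHKERRQ(ierr);
    ierr = ISCreateGeneral(comm,np,pressure_idx,PETSC_COPY_VALUES,&fields[1]);CHKERRQ(ierr);
    ierr = ISCreateGeneral(comm,nbc,fixed_bc_idx,PETSC_COPY_VALUES,&fields[2]);CHKERRQ(ierr);
    ierr = PetscLayoutSetFields(map,3,fields);CHKERRQ(ierr);   /* hypothetical call */
    PetscFunctionReturn(0);
  }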
>>
>> Barry
>>
>>> In that context, there is no particular disadvantage to always using global
>>> ISs, and I think it would help keep the code simple. If you are going to
>>> solve part of the problem on a sub-communicator, you can define that
>>> subcomm as all those processes that have a non-empty local part in the IS.
>>> If you have a huge number of splits (such that individual splits no longer
>>> have semantic meaning), then I think it serves a different purpose.
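One way to realize that subcommunicator rule with standard calls (a sketch,
assuming a global IS as described above):

  /* subcomm = exactly those processes whose local part of the IS is non-empty */
  PetscErrorCode GetSplitSubcomm(IS is,MPI_Comm *subcomm)
  {
    PetscErrorCode ierr;
    PetscInt       nlocal;
    PetscMPIInt    rank;
    MPI_Comm       comm;

    PetscFunctionBegin;
    ierr = PetscObjectGetComm((PetscObject)is,&comm);CHKERRQ(ierr);
    ierr = ISGetLocalSize(is,&nlocal);CHKERRQ(ierr);
    ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
    /* ranks with an empty local part get MPI_COMM_NULL back */
    ierr = MPI_Comm_split(comm,nlocal ? 0 : MPI_UNDEFINED,rank,subcomm);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }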