On 19 Dec 2014, at 17:12, Jed Brown <j...@jedbrown.org> wrote:

> Lawrence Mitchell <lawrence.mitch...@imperial.ac.uk> writes:
> 
>> Dear petsc-users,
>> 
>> I'm trying to setup matrices and data structures for use with 
>> MatGetLocalSubMatrix, but I'm rather lost in a world of ISes and block 
>> sizes.  I have the layouts and so forth correct where all my fields have 
>> block size 1, but am struggling to follow how things fit together for block 
>> size > 1.
>> 
>> I have a Taylor-Hood discretisation so a P2 velocity space with block size 
>> 2, and a P1 pressure space with block size 1.
>> 
>> On each process, I build the full local to global mapping for both fields.  
>> This has block size 1.
> 
> How are you ordering the fields on each process?


field_0_proc_0, field_1_proc_0, ..., field_N_proc_0; field_0_proc_1, ..., 
field_N_proc_1; ... field_N_proc_P


> 
>> Then I create strided ISes to define the local blocks for each field, and 
>> set the block size on each of them (2 for the velocity space, 1 for the 
>> pressure).  Aside, when I do ISCreateStride for an IS with a block size > 1, 
>> do I provide all the indices, or just the blocked indices?  Should I be 
>> using ISCreateBlock for block size > 1 instead?
> 
> ISSTRIDE has no concept of block size and can't be used to describe
> blocked index sets.  Use ISBLOCK instead.

What is ISSetBlockSize for then?  Just hanging information on the IS for use 
elsewhere?


>> Calling MatGetLocalSubMatrix results in an error:
>> 
>> [0]PETSC ERROR: --------------------- Error Message 
>> --------------------------------------------------------------
>> [0]PETSC ERROR: Petsc has generated inconsistent data
>> [0]PETSC ERROR: Blocksize of localtoglobalmapping 1 must match that of 
>> layout 2
> 
> Hmm, I'm concerned that this might not work right after some recent
> changes to ISLocalToGlobalMapping.  Can you reproduce this with some
> code I can run?

I'll try to put something together.

Thanks,

Lawrence
