Each process calls PCFieldSplitSetIS() with the indices for THAT FIELD that are owned by that process. Each process calls PCFieldSplitSetIS() n+1 times if you have n+1 fields. Identifying fields with processes is a mistake.
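For your two-rank example below, a minimal sketch of what every rank would execute might look like the following. This assumes the petsc-dev calling sequence PCFieldSplitSetIS(pc, splitname, is) with named splits; the split names "interior0", "interior1", "boundary" are only illustrative, and older dev snapshots differ slightly (e.g. ISDestroy taking the IS by value). The point is that all ranks make the same three collective calls, passing an empty IS for any field they own no part of:

#include <petscpc.h>

/* Collective setup of the three splits from the example: every rank calls
   PCFieldSplitSetIS() once per field, with an IS holding only the entries
   of that field owned by this rank (empty where it owns none). */
PetscErrorCode SetupSplits(PC pc)
{
  PetscMPIInt    rank;
  PetscInt       i,idx[100];
  IS             is;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);

  /* Split "interior0": rank 0 contributes indices 0..99, rank 1 nothing */
  for (i=0; i<100; i++) idx[i] = i;
  ierr = ISCreateGeneral(PETSC_COMM_WORLD,(rank==0)?100:0,idx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"interior0",is);CHKERRQ(ierr);
  ierr = ISDestroy(&is);CHKERRQ(ierr);

  /* Split "interior1": rank 1 contributes indices 110..209, rank 0 nothing */
  for (i=0; i<100; i++) idx[i] = 110+i;
  ierr = ISCreateGeneral(PETSC_COMM_WORLD,(rank==1)?100:0,idx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"interior1",is);CHKERRQ(ierr);
  ierr = ISDestroy(&is);CHKERRQ(ierr);

  /* Split "boundary": rank 0 contributes 100..109, rank 1 contributes 210..219 */
  for (i=0; i<10; i++) idx[i] = (rank==0) ? 100+i : 210+i;
  ierr = ISCreateGeneral(PETSC_COMM_WORLD,10,idx,PETSC_COPY_VALUES,&is);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc,"boundary",is);CHKERRQ(ierr);
  ierr = ISDestroy(&is);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}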
   Matt

On Wed, Mar 9, 2011 at 1:50 PM, Thomas Witkowski <Thomas.Witkowski at tu-dresden.de> wrote:

> As I already asked on petsc-users, I want to implement some kind of
> iterative substructuring algorithm in my FEM code. It was suggested to me
> to switch to the dev version of PETSc and to make use of PCFieldSplit. So
> far I have installed petsc-dev and read a little about PCFieldSplit. It
> sounds great, but I'm not really sure how to make use of it. In my code I
> want to build n blocks (where n is also the number of processors), one for
> the interior domain of each rank. Okay, this seems to be easy: I make just
> one call to PCFieldSplitSetIS on each rank, with IS being the global
> indices of the rank's interior nodes. But what about the (n+1)-th block,
> which should contain all the nodes on the boundaries between the
> subdomains? Each rank contributes to this block, so how should
> PCFieldSplitSetIS be called?
>
> Maybe a small example: Assume we have two ranks, each with 100 nodes in
> its interior domain, and each rank contributes 10 nodes to the interior
> boundary (so the overall interior boundary contains 20 nodes). Rank 0 owns
> global indices 0 to 109, with 100 to 109 being the nodes of the first part
> of the interior boundary, and rank 1 owns global indices 110 to 219, with
> 210 to 219 being the nodes of the second part of the interior boundary. If
> I understood the idea behind PCFieldSplit correctly, it should be possible
> to generate the three blocks [0-99], [110-209], [100-109,210-219] using
> PCFieldSplitSetIS. But how do I call it correctly?
>
> Thomas

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
