Xiangdong <[email protected]> writes:

> For example, if the values on the 4-by-4 grid are [1,2,3,4; 5,6,7,8;
> 9,10,11,12; 13,14,15,16]. If I use 4 processors and set m=2, n=2 (or use
> PETSC_DECIDE), then on processor zero, the local portion of the global
> vector is 1,2,3,4
No, PETSc global ordering is different from natural. There is a detailed
picture of this in the users manual and in most PETSc tutorials. Please
read that.

> while the local vector has value 1,2,5,6. On processor one, the local
> portion of the global vector is 5,6,7,8; and the local vector is
> 3,4,7,8. It looks like the global vector is in natural order, while the
> local vector is in PETSc order.

No.

>> > 2) DOF. In each cell, I have two unknowns, say ux and uy. One way is
>> > to store them using one global vector with dof=2. The other way is to
>> > create two global vectors for ux and uy with dof=1. Is one approach
>> > better than the other?
>>
>> The former is better for memory streaming unless your operations
>> traverse the grid using only one at a time (and then, it would be better
>> to rephrase to traverse fewer times, using both values each time).
>
> Are there any examples in the PETSc tutorials demonstrating the case
> dof>1? I found most of them are dof=1. For dof>1, are the values stored
> in an interleaved manner?

src/snes/examples/tutorials/ex48.c uses dof=2 and
MatSetValuesBlockedStencil.
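If the picture in the manual is not enough, the point is easy to check
directly. Below is a minimal sketch (mine, not from any tutorial; recent
PETSc API assumed, error checking omitted for brevity) that fills a vector
with 1..16 in natural (row-major) ordering on the 4-by-4 grid and scatters
it into PETSc global ordering. Run on 4 ranks, VecView shows rank 0 holding
its 2x2 subdomain 1,2,5,6, not the first grid row 1,2,3,4:

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM       da;
  Vec      natural, global;
  PetscInt i, rstart, rend;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* 4x4 grid, dof=1, stencil width 1; PETSC_DECIDE picks a 2x2 process
     grid on 4 ranks */
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, 4, 4, PETSC_DECIDE, PETSC_DECIDE,
               1, 1, NULL, NULL, &da);
  DMSetFromOptions(da);
  DMSetUp(da);

  /* Fill a vector in natural (application) ordering with 1..16 */
  DMDACreateNaturalVector(da, &natural);
  VecGetOwnershipRange(natural, &rstart, &rend);
  for (i = rstart; i < rend; i++)
    VecSetValue(natural, i, (PetscScalar)(i + 1), INSERT_VALUES);
  VecAssemblyBegin(natural);
  VecAssemblyEnd(natural);

  /* Scatter into PETSc global ordering: each rank's slice of the global
     vector is its own contiguous subdomain, so rank 0 holds 1,2,5,6 */
  DMCreateGlobalVector(da, &global);
  DMDANaturalToGlobalBegin(da, natural, INSERT_VALUES, global);
  DMDANaturalToGlobalEnd(da, natural, INSERT_VALUES, global);
  VecView(global, PETSC_VIEWER_STDOUT_WORLD);

  VecDestroy(&natural);
  VecDestroy(&global);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

DMDAGlobalToNaturalBegin/End performs the inverse scatter when you need
output back in application ordering.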
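And to make the dof>1 layout explicit: with dof=2 the components are
interleaved point by point (ux0,uy0,ux1,uy1,...), so the usual idiom is a
struct overlay via DMDAVecGetArray. A short sketch under that assumption;
the Field/ux/uy names here are illustrative, not taken from ex48.c:

#include <petscdmda.h>

typedef struct {
  PetscScalar ux, uy;  /* interleaved: memory layout is ux0,uy0,ux1,uy1,... */
} Field;

int main(int argc, char **argv)
{
  DM       da;
  Vec      X;
  Field  **x;
  PetscInt i, j, xs, ys, xm, ym;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* 8x8 grid with dof=2: both unknowns live in one global vector */
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
               DMDA_STENCIL_STAR, 8, 8, PETSC_DECIDE, PETSC_DECIDE,
               2, 1, NULL, NULL, &da);
  DMSetFromOptions(da);
  DMSetUp(da);
  DMCreateGlobalVector(da, &X);

  /* Visit each owned grid point once and touch both components */
  DMDAVecGetArray(da, X, &x);
  DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);
  for (j = ys; j < ys + ym; j++)
    for (i = xs; i < xs + xm; i++) {
      x[j][i].ux = (PetscScalar)i;
      x[j][i].uy = (PetscScalar)j;
    }
  DMDAVecRestoreArray(da, X, &x);

  VecDestroy(&X);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

This interleaving is what gives the memory-streaming benefit mentioned
above: one pass over the grid streams both components of each point.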
