On Fri, Mar 27, 2015 at 4:28 AM, Florian Lindner <[email protected]> wrote:
> On Thursday, 26 March 2015, 07:34:27, Jed Brown wrote:
> > Florian Lindner <[email protected]> writes:
> > > Hello,
> > >
> > > I'm using petsc with petsc4py.
> > >
> > > A matrix is created like that
> > >
> > > MPIrank = MPI.COMM_WORLD.Get_rank()
> > > MPIsize = MPI.COMM_WORLD.Get_size()
> > > print("MPI Rank = ", MPIrank)
> > > print("MPI Size = ", MPIsize)
> > > parts = partitions()
> > >
> > > print("Dimension= ", nSupport + dimension, "bsize = ", len(parts[MPIrank]))
> > >
> > > MPI.COMM_WORLD.Barrier() # Just to keep the output together
> > > A = PETSc.Mat(); A.createDense( (nSupport + dimension, nSupport + dimension), bsize = len(parts[MPIrank]) ) # <-- crash here
> >
> > bsize is collective (must be the same on all processes). It is used for
> > vector-valued problems (like elasticity -- bs=3 in 3 dimensions).
>
> It seems I'm still misunderstanding the bsize parameter.
>
> If I distribute a 10x10 matrix on three ranks I need to have a
> non-homogeneous distribution, and that's what petsc does itself:

blockSize really means the uniform block size of the matrix, thus it HAS to
divide the global size. If it does not, you do not have a uniform block
size, you have a bunch of differently sized blocks.

  Thanks,

     Matt

> A.createDense( (n, n) )
>
> print("Rank = ", rank, "Range = ", A.owner_range, "Size = ",
> A.owner_range[1] - A.owner_range[0])
> print("Rank = ", rank, "ColRange = ", A.getOwnershipRangeColumn(), "Size = ",
> A.getOwnershipRangeColumn()[1] - A.getOwnershipRangeColumn()[0])
>
> gives:
>
> Rank = 2 Range = (7, 10) Size = 3
> Rank = 2 ColRange = (7, 10) Size = 3
> Rank = 0 Range = (0, 4) Size = 4
> Rank = 0 ColRange = (0, 4) Size = 4
> Rank = 1 Range = (4, 7) Size = 3
> Rank = 1 ColRange = (4, 7) Size = 3
>
> How can I manually set a distribution of rows like above? My approach was
> to call create with bsize = [3,3,4][rank] but that obviously is not the
> way...
> Thanks,
> Florian

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener
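[Editor's note] The default row distribution PETSc reports above (4/3/3 rows for n = 10 on three ranks) follows a simple split that can be sketched in plain Python. The helper `owner_range` below is illustrative, not a PETSc call, and the closing comment about passing (local, global) size pairs to `createDense` is my reading of the petsc4py size convention; check it against the petsc4py documentation.

```python
# Sketch of PETSc's default ownership split of n rows over `size` ranks:
# every rank gets n // size rows, and the first n % size ranks get one extra.

def owner_range(n, size, rank):
    """Default (start, end) row range owned by `rank` (illustrative helper)."""
    local = n // size + (1 if rank < n % size else 0)
    start = sum(n // size + (1 if r < n % size else 0) for r in range(rank))
    return (start, start + local)

# Reproduces the ranges printed above for a 10x10 matrix on 3 ranks:
for rank in range(3):
    print("Rank =", rank, "Range =", owner_range(10, 3, rank))
# Rank = 0 Range = (0, 4)
# Rank = 1 Range = (4, 7)
# Rank = 2 Range = (7, 10)

# To impose a custom distribution, petsc4py takes (local, global) size
# pairs rather than bsize, e.g. (verify against the petsc4py docs):
#   A.createDense(((local_rows, n), (local_cols, n)))
```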
