Hello, I'm using PETSc with petsc4py.
A matrix is created like this:

    from petsc4py import PETSc
    from mpi4py import MPI

    # nSupport, dimension and partitions() are defined elsewhere in my code.
    MPIrank = MPI.COMM_WORLD.Get_rank()
    MPIsize = MPI.COMM_WORLD.Get_size()
    print("MPI Rank = ", MPIrank)
    print("MPI Size = ", MPIsize)
    parts = partitions()
    print("Dimension= ", nSupport + dimension, "bsize = ", len(parts[MPIrank]))
    MPI.COMM_WORLD.Barrier()  # just to keep the output together
    A = PETSc.Mat()
    A.createDense((nSupport + dimension, nSupport + dimension),
                  bsize=len(parts[MPIrank]))  # <-- crash here

The output for mpirun -n 2 looks like this, and it works:

    MPI Rank = 0
    MPI Size = 2
    Dimension= 10 bsize = 5
    MPI Rank = 1
    MPI Size = 2
    Dimension= 10 bsize = 5

But for mpirun -n 3 it crashes:

    MPI Rank = 2
    MPI Size = 3
    Dimension= 10 bsize = 3
    MPI Rank = 1
    MPI Size = 3
    Dimension= 10 bsize = 4
    MPI Rank = 0
    MPI Size = 3
    Dimension= 10 bsize = 3

The error is:

    ValueError: global size 10 not divisible by block size 3

I tried to dig a bit into the source. Mat_Create uses Mat_Dense to unpack the size and bsize parameters. When just a scalar n is given as the size parameter instead of a tuple t, it assumes t = (n, n). (Right?) That should be fine.

When I omit bsize and run with mpirun -n 3, A.owner_range returns

    (4, 7) (0, 4) (7, 10)

which is essentially the same partitioning that I set manually (I need to set the partitioning manually). Also, the sum of the bsizes equals the dimension.

What is the problem here?

Thanks,
Florian
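P.S. In case it helps to see what I am actually after: below is a minimal standalone sketch of how I currently understand the API. The 3/4/3 split of 10 rows is hard-coded to mirror the output above, and my (possibly wrong) reading of the docs is that the size argument accepts ((local, global), (local, global)) tuples, so local partition sizes would go there rather than into bsize:

    # Minimal sketch, run with: mpirun -n 3 python test.py
    # Assumption: size accepts ((local, global), (local, global)) tuples,
    # and bsize means block size (unknowns per node), NOT the local
    # partition size -- which is what I seem to be misusing it for.
    from petsc4py import PETSc
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Hard-coded stand-in for my partitions(): 3/4/3 rows across 3 ranks.
    local_rows = [3, 4, 3][rank]
    global_rows = 10

    A = PETSc.Mat()
    # Set the local row/column counts explicitly; bsize is left at its
    # default of 1, so no divisibility constraint should apply.
    A.createDense(((local_rows, global_rows), (local_rows, global_rows)),
                  comm=PETSc.COMM_WORLD)
    print(rank, A.owner_range)

Is that the intended way to set the partitioning manually, or is bsize supposed to be usable for this?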