Florian Lindner <[email protected]> writes:

> Hello,
>
> I'm using petsc with petsc4py.
>
> A matrix is created like that
>
>     MPIrank = MPI.COMM_WORLD.Get_rank()
>     MPIsize = MPI.COMM_WORLD.Get_size()
>     print("MPI Rank = ", MPIrank)
>     print("MPI Size = ", MPIsize)
>     parts = partitions()
>     
>     print("Dimension= ", nSupport + dimension, "bsize = ", len(parts[MPIrank]))
>
>     MPI.COMM_WORLD.Barrier() # Just to keep the output together
>     A = PETSc.Mat(); A.createDense((nSupport + dimension, nSupport + dimension), bsize=len(parts[MPIrank]))  # <-- crash here

bsize is collective (it must be the same on all processes).  It is used
for vector-valued problems (e.g. elasticity, where bs=3 in 3 dimensions),
not for describing per-rank partition sizes.
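In case the distinction helps: below is a minimal pure-Python sketch (no
petsc4py required; the helper name `split_ownership` is mine, not a PETSc
API) of how a PETSC_DECIDE-style ownership split gives each rank a
*different* local row count.  Per-rank sizes like `len(parts[MPIrank])`
belong in the local part of the matrix sizes, whereas a collective
argument such as bsize must be passed identically on every rank.

```python
def split_ownership(n_global, size):
    """Divide n_global rows among `size` ranks the way PETSc's default
    ownership split does: n // size rows each, with the first
    n % size ranks taking one extra row."""
    base, extra = divmod(n_global, size)
    return [base + (1 if rank < extra else 0) for rank in range(size)]

# Example: 10 global rows over 3 ranks -> local sizes [4, 3, 3]
print(split_ownership(10, 3))
```

If the goal was to set the local row range explicitly, petsc4py lets you
pass (local, global) size pairs to createDense (something along the lines
of `A.createDense(((nlocal, n), (nlocal, n)))`) rather than encoding the
partition in bsize.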
