No, it uses the exact layout you provided. You can use https://petsc.org/release/manualpages/PC/PCREDISTRIBUTE/#pcredistribute to have the solver redistribute the rows so that each MPI process gets an equal number during the solve, which will give you the effect you are looking for.
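
In case it is useful, here is a minimal sketch (the 3/1 row split and the tiny diagonal CSR data are made up purely for illustration) of building the matrix with MatCreateMPIAIJWithArrays() from unequal local row counts and then solving through PCREDISTRIBUTE. One usually runs it with -ksp_type preonly so the inner, rebalanced solve does the actual work.

/* Minimal sketch (made-up data): unequal local row counts with
   MatCreateMPIAIJWithArrays(), solved through PCREDISTRIBUTE.
   Intended for exactly 2 MPI ranks, e.g.
     mpiexec -n 2 ./ex -ksp_type preonly -pc_type redistribute -redistribute_ksp_type gmres */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, b;
  KSP         ksp;
  PC          pc;
  PetscMPIInt rank;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  /* Deliberately unbalanced layout: rank 0 owns 3 rows, rank 1 owns 1 row. */
  PetscInt    m      = (rank == 0) ? 3 : 1;
  PetscInt    rstart = (rank == 0) ? 0 : 3; /* first global row owned by this rank */
  PetscInt    ia[4], ja[3];                 /* CSR row pointers and global column indices */
  PetscScalar va[3];
  for (PetscInt i = 0; i <= m; i++) ia[i] = i;                          /* one nonzero per row */
  for (PetscInt i = 0; i < m; i++) { ja[i] = rstart + i; va[i] = 2.0; } /* simple diagonal */

  /* PETSc keeps exactly this layout; the arrays are copied internally. */
  PetscCall(MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD, m, m, PETSC_DETERMINE, PETSC_DETERMINE, ia, ja, va, &A));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCREDISTRIBUTE)); /* rows rebalanced only for the inner solve */
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(MatDestroy(&A));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(PetscFinalize());
  return 0;
}

The solution is scattered back to your original (unbalanced) layout after the solve, so nothing else in your code has to change.
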
  Barry

> On Feb 8, 2024, at 4:07 AM, Maruthi NH <[email protected]> wrote:
>
> Hi Barry,
> Thanks. Yes, the global column index was wrong. I have one more question
> regarding MatCreateMPIAIJWithArrays. If I have 100 elements in rank 0 and 50
> in rank 1, does PETSc redistribute equally among procs before solving?
>
> Regards,
> Maruthi
>
> On Mon, Feb 5, 2024 at 2:18 AM Barry Smith <[email protected]> wrote:
>>
>>    Is each rank trying to create its own sequential matrix with
>> MatCreateSeqAIJWithArrays() or did you mean MatCreateMPIAIJWithArrays()?
>>
>>    If the latter, then possibly one of your size arguments is wrong or the
>> indices are incorrect for the given sizes.
>>
>>    Barry
>>
>>
>> > On Feb 4, 2024, at 3:15 PM, Maruthi NH <[email protected]> wrote:
>> >
>> > Hi all,
>> >
>> > I have a row, col, and A values in CSR format; let's say rank 0 has 200
>> > unknowns and rank 1 has 100 unknowns. If I use MatCreateSeqAIJWithArrays
>> > to create a Matrix, it crashes. However, if each rank has an equal number
>> > of unknowns, it works fine. Please let me know how to proceed
>> >
>> >
>> > Regards,
>> > Maruthi
>>
