On Thu, Dec 19, 2013 at 5:20 PM, James A Charles <[email protected]> wrote:
> My original matrix is serial. I then want to distribute this across MPI
> processes for use with Arpack. MatGetSubMatrix wouldn't change
> communicators right?

When you create the matrix, use PETSC_COMM_WORLD and 0 sizes on the other
procs.

   Matt

> ----- Original Message -----
> From: "Matthew Knepley" <[email protected]>
> To: "James A Charles" <[email protected]>
> Cc: [email protected]
> Sent: Thursday, December 19, 2013 5:58:25 PM
> Subject: Re: [petsc-users] Petsc Matrix Redistribution
>
> On Thu, Dec 19, 2013 at 4:44 PM, James A Charles <[email protected]> wrote:
>
> > Hello,
> >
> > I want to redistribute a matrix across MPI processes (block tridiagonal
> > of around size 20,000) for use with Arpack so that I can solve for
> > eigenpairs in parallel. Is this possible and if so what is the best way
> > to do this?
>
> The easiest way is probably
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetSubMatrix.html,
> but there is also
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatPermute.html.
>
>    Matt
>
> > Thanks,
> > James Charles
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
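
A minimal sketch of the two steps discussed above, in case it helps: build
the matrix on PETSC_COMM_WORLD with all rows on rank 0 (0 local sizes on the
other procs), then pull out a redistributed copy with MatGetSubMatrix. This
is not from the thread; the 1D Laplacian standing in for the real
block-tridiagonal matrix, the even split from PetscSplitOwnership, and the
variable names are illustrative assumptions, and preallocation is left to
the defaults for brevity. (In PETSc 3.8 and later the same call is spelled
MatCreateSubMatrix.)

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat            A, Adist;
    IS             rows;
    PetscInt       N = 20000, nloc, rstart, rend, i;
    PetscMPIInt    rank;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

    /* Create the matrix on PETSC_COMM_WORLD, but give rank 0 every row and
       column and the other ranks 0 local sizes. */
    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, rank ? 0 : N, rank ? 0 : N, N, N);CHKERRQ(ierr);
    ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
    ierr = MatSetUp(A);CHKERRQ(ierr);

    /* Only rank 0 owns rows, so only rank 0 inserts values (a tridiagonal
       1D Laplacian stands in for the real matrix here). */
    ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
    for (i = rstart; i < rend; i++) {
      ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
      if (i > 0)     {ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
      if (i < N - 1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    /* Describe the rows each process should own in the redistributed copy
       (an even split here) with a parallel index set. */
    nloc = PETSC_DECIDE;
    ierr = PetscSplitOwnership(PETSC_COMM_WORLD, &nloc, &N);CHKERRQ(ierr);
    ierr = MPI_Scan(&nloc, &rstart, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);CHKERRQ(ierr);
    rstart -= nloc;
    ierr = ISCreateStride(PETSC_COMM_WORLD, nloc, rstart, 1, &rows);CHKERRQ(ierr);

    /* Extract all rows and columns with the new layout; using the same IS
       for rows and columns keeps every column and gives each process a
       matching diagonal block. The result stays on the same communicator,
       which is the point raised above: the communicator does not change. */
    ierr = MatGetSubMatrix(A, rows, rows, MAT_INITIAL_MATRIX, &Adist);CHKERRQ(ierr);

    /* ... hand Adist to the eigensolver (e.g. SLEPc's ARPACK interface) ... */

    ierr = ISDestroy(&rows);CHKERRQ(ierr);
    ierr = MatDestroy(&Adist);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return 0;
  }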
