On Thu, Oct 31, 2013 at 8:41 PM, Jose David Bermeol <[email protected]> wrote:

> For instance I have a matrix of 400x8 and I want to split it in two MPI
> processes, one with the first 100 rows and the second one with the next
> 300 rows.

1) It's not necessary to mail both petsc-users and petsc-dev.

2) You prescribed the range space, (100, 300); now you can prescribe the
domain by dividing the columns. It is not usually important.

   Matt

> ----- Original Message -----
> From: "Jose David Bermeol" <[email protected]>
> To: "Matthew Knepley" <[email protected]>
> Cc: "petsc-users" <[email protected]>, "petsc-dev" <[email protected]>
> Sent: Thursday, October 31, 2013 9:38:43 PM
> Subject: Re: [petsc-dev] MatCreateDense
>
> What do you mean by "number of local columns"?
>
> Thanks
>
> ----- Original Message -----
> From: "Matthew Knepley" <[email protected]>
> To: "Jose David Bermeol" <[email protected]>
> Cc: "petsc-dev" <[email protected]>, "petsc-users" <[email protected]>
> Sent: Thursday, October 31, 2013 2:37:43 PM
> Subject: Re: [petsc-dev] MatCreateDense
>
> On Thu, Oct 31, 2013 at 1:21 PM, Jose David Bermeol <[email protected]> wrote:
>
> > Hi, a small question. In the method MatCreateSeqDense(MPI_Comm comm,
> > PetscInt m, PetscInt n, PetscScalar *data, Mat *A) I'm giving the local
> > number of rows and columns, so should I pass the total number of local
> > columns, or should I pass columns / number_of_MPI_processes?
>
> If it's sequential, you pass the total number of columns. If it's
> MatCreateMPIDense(), pass the number of local columns. This is used to
> create a vector with the same layout as y = A^T x.
>
>    Matt
>
> > Thanks

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
