On Feb 14, 2013, at 12:32 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Feb 14, 2013 at 1:22 PM, Chekuri Choudary <cchoudary at rnet-tech.com> wrote:
>  
> 
> How can I view the local matrices on each processor? I am trying to 
> understand how a DMDA matrix is distributed among multiple processes.
> 

   Consider 2d: the domain is split into rectangular subregions (depending on 
how many processes there are and how many grid points there are in the x and y 
directions). Each process is assigned one of these subregions. For any run you 
can use DMView(da, PETSC_VIEWER_STDOUT_(comm)); to see the decomposition.
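
For example, a minimal sketch along these lines (untested; the 8 by 8 grid, the 
single degree of freedom, and the name da are just placeholders, and older PETSc 
versions spell the boundary type DMDA_BOUNDARY_NONE and do not need the 
DMSetUp() call) prints the decomposition for however many processes you run on:

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM da;

    PetscInitialize(&argc, &argv, NULL, NULL);
    /* 8 x 8 grid, 1 dof, stencil width 1; PETSc chooses the process layout */
    DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR, 8, 8, PETSC_DECIDE, PETSC_DECIDE,
                 1, 1, NULL, NULL, &da);
    DMSetFromOptions(da);
    DMSetUp(da);
    /* shows which grid points each process owns */
    DMView(da, PETSC_VIEWER_STDOUT_WORLD);
    DMDestroy(&da);
    PetscFinalize();
    return 0;
  }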

Now, as always with PETSc vectors, all the vector entries on the first process 
are numbered first, followed by the vector entries on the second, the third, and 
so on. Similarly for matrices, the rows associated with the vector entries on 
the first process are numbered first, then the rows associated with the vector 
entries on the second, and so on. Thus the entries in the vector are not in the 
"natural ordering for the entire domain". But VecView() and MatView() for DMDA 
vectors and matrices automatically reorder, so when you view them they appear 
in the natural ordering.
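
A quick way to see this with a vector: fill a DMDA global vector with a value 
that encodes each point's natural (i,j) location and then view it. A rough, 
untested sketch, assuming the 2d DMDA da from above (the formula 10*j + i is 
just an arbitrary way to mark the locations):

  Vec           x;
  PetscScalar **arr;
  PetscInt      i, j, xs, ys, xm, ym;

  DMCreateGlobalVector(da, &x);
  DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);
  DMDAVecGetArray(da, x, &arr);
  for (j = ys; j < ys + ym; j++)
    for (i = xs; i < xs + xm; i++)
      arr[j][i] = 10.0*j + i;        /* records the natural (i,j) location */
  DMDAVecRestoreArray(da, x, &arr);
  /* prints in the natural ordering, regardless of the parallel layout */
  VecView(x, PETSC_VIEWER_STDOUT_WORLD);
  VecDestroy(&x);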

  You are overthinking things if you want to see the matrix in the 
"non-natural ordering"; there is no reason to worry about that. Just use 
MatSetValuesLocal() or MatSetValuesStencil() to enter the entries; the fact 
that PETSc internally uses this other, non-natural ordering does not matter 
when using it.
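
For instance, here is a rough, untested sketch that fills the DMDA matrix with 
a standard 5-point Laplacian using MatSetValuesStencil(); the variable names and 
the boundary treatment (just a 1 on the diagonal) are only for illustration, and 
in older PETSc versions DMCreateMatrix() also takes a MatType argument:

  Mat         A;
  MatStencil  row, col[5];
  PetscScalar v[5];
  PetscInt    i, j, xs, ys, xm, ym, M, N;

  DMCreateMatrix(da, &A);
  DMDAGetInfo(da, NULL, &M, &N, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);
  DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      row.i = i; row.j = j;             /* row addressed by natural (i,j) index */
      if (i == 0 || j == 0 || i == M-1 || j == N-1) {
        v[0] = 1.0;                     /* identity row on the boundary */
        MatSetValuesStencil(A, 1, &row, 1, &row, v, INSERT_VALUES);
      } else {
        col[0].i = i;   col[0].j = j-1; v[0] = -1.0;
        col[1].i = i-1; col[1].j = j;   v[1] = -1.0;
        col[2].i = i;   col[2].j = j;   v[2] =  4.0;
        col[3].i = i+1; col[3].j = j;   v[3] = -1.0;
        col[4].i = i;   col[4].j = j+1; v[4] = -1.0;
        MatSetValuesStencil(A, 1, &row, 5, col, v, INSERT_VALUES);
      }
    }
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  /* the rows are presented in the natural ordering */
  MatView(A, PETSC_VIEWER_STDOUT_WORLD);
  MatDestroy(&A);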

   What particular issue are you trying to resolve by "viewing the local 
matrices on each process"?

   Barry

> 
> Rows are contiguous, so viewing the global matrix shows you the local pieces.
> 
>    Matt
>  
> 
> Thanks
> 
> Chekuri
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
