On May 13, 2014, at 11:42 AM, Hossein Talebi <[email protected]> wrote:

> 
> I have already decomposed the finite element system using Metis. I just need 
> the global rows to be exactly as I define them, and I would like to have the 
> answer in the same layout so I don't have to move things around between 
> processes again.  

   Metis tells you a good partitioning; it does NOT move the elements to form 
that partitioning. Do you move the elements around based on what Metis told 
you, and, similarly, do you renumber the elements (and vertices) so they are 
contiguously numbered on each process, with the first process getting the 
first set of numbers, the second process the second set of numbers, etc.? 
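
   (A minimal Fortran sketch of that contiguous renumbering step, assuming a 
prefix sum over the per-process vertex counts with MPI_Exscan; nlocal below is 
a placeholder value, not something taken from this thread:)

      program renumber_offset
      use mpi
      implicit none
      integer :: ierr, rank, nlocal, offset

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

      ! nlocal = number of vertices this process owns after moving the
      ! elements/vertices according to the Metis partition (placeholder)
      nlocal = 100 + rank

      ! Prefix sum of the local counts gives the first global number owned
      ! by this process: rank 0 gets 0..nlocal-1, rank 1 the next block, etc.
      offset = 0
      call MPI_Exscan(nlocal, offset, 1, MPI_INTEGER, MPI_SUM,              &
                      MPI_COMM_WORLD, ierr)
      if (rank == 0) offset = 0  ! MPI_Exscan leaves rank 0's result undefined

      ! the global number of local vertex i (0-based) is then offset + i

      call MPI_Finalize(ierr)
      end program renumber_offset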

   If you do all that, then when you create the Vec and Mat you simply set the 
local size (based on the number of local vertices on each process). You never 
need to use PetscLayoutCreate(); in fact, even if your code were in C you 
would never use PetscLayoutCreate().
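
   (A minimal Fortran sketch of creating a Vec and Mat with only the local 
sizes given, as described above; nlocal and the preallocation counts dnz/onz 
are placeholder values, and the include/use lines follow the current PETSc 
Fortran conventions, which differ in older releases:)

      program create_vec_mat
#include <petsc/finclude/petscvec.h>
#include <petsc/finclude/petscmat.h>
      use petscvec
      use petscmat
      implicit none

      Vec            x
      Mat            A
      PetscInt       nlocal, dnz, onz
      PetscErrorCode ierr

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

      ! local sizes known from the existing decomposition (placeholders here)
      nlocal = 100
      dnz    = 30      ! rough diagonal-block nonzeros per row
      onz    = 10      ! rough off-diagonal-block nonzeros per row

      ! Give only the local sizes; PETSc sums them for the global size and
      ! assigns contiguous row blocks in rank order.
      call VecCreateMPI(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, x, ierr)
      call MatCreateAIJ(PETSC_COMM_WORLD, nlocal, nlocal,                   &
                        PETSC_DETERMINE, PETSC_DETERMINE,                   &
                        dnz, PETSC_NULL_INTEGER,                            &
                        onz, PETSC_NULL_INTEGER, A, ierr)

      ! ... set values with global indices, assemble, solve ...

      call MatDestroy(A, ierr)
      call VecDestroy(x, ierr)
      call PetscFinalize(ierr)
      end program create_vec_mat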
 
   If you do not do all that, then you need to do it first, before you start 
calling PETSc.

   Barry

> 
> No, I don't need it for anything else.
> 
> Cheers
> Hossein
> 
> 
> 
> 
> On Tue, May 13, 2014 at 6:36 PM, Matthew Knepley <[email protected]> wrote:
> On Tue, May 13, 2014 at 11:07 AM, Hossein Talebi <[email protected]> 
> wrote:
> Hi All,
> 
> 
> I am using PETSc from Fortran. I would like to define my own layout, i.e. 
> which row belongs to which CPU, since I have already done the domain 
> decomposition. It appears that PetscLayoutCreate() and the related routines 
> do this, but the manual says they are not provided in Fortran. 
> 
> Is there any way I can do this from Fortran? Does anyone have an example?
> 
> You can do this for Vec and Mat directly. Do you want it for something else?
> 
>   Thanks,
> 
>      Matt
>  
> Cheers
> Hossein
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> 
> 
> -- 
> www.permix.org
