On May 14, 2014, at 12:43 AM, Hossein Talebi <[email protected]> wrote:

> 
> Thank you.
> 
> Well, only the first part. I move around the elements and identify the Halo 
> nodes etc. However, I do not renumber the vertices to be contiguous on the 
> CPUs as you said.

    You need to do this! Once this is done, using the PETSc solvers is 
easy. Note you can do this by simply counting the number of local vertices on 
each process and using an MPI_Scan to get the first global number owned by each 
process.
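
    The offset computation can be sketched like this (simulated serially in 
Python here, since the real code would call MPI_Scan across ranks; the counts 
are made-up values for illustration):

```python
# Simulate what MPI_Scan (an inclusive prefix sum) computes across ranks.
# On each rank, the scan result minus the local count gives the first
# global vertex number owned by that rank.
local_counts = [40, 35, 50, 25]  # hypothetical vertices owned by ranks 0..3

first_global = []
running = 0
for count in local_counts:
    running += count                      # what MPI_Scan returns on this rank
    first_global.append(running - count)  # first global index on this rank

print(first_global)
```

In the actual Fortran code each rank would pass its own local count to 
MPI_Scan and subtract that count from the result.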
>  
> BUT, I just noticed: I partition the domain based on the computational weight 
> of the elements, which is different from that of the Mat-Vec calculation. This 
> means my partitioning may not be efficient for the solution process. 

    That is fine; it is what we do too. 
> 
> I think I will then go with the copy-in, solve, copy-out option.

    I do not know what you mean here but it sounds bad.

> 
> 
> 
> 
> On Wed, May 14, 2014 at 3:06 AM, Barry Smith <[email protected]> wrote:
> 
> On May 13, 2014, at 11:42 AM, Hossein Talebi <[email protected]> wrote:
> 
> >
> > I have already decomposed the Finite Element system using Metis. I just 
> > need to have the global rows exactly like how I define and I like to have 
> > the answer in the same layout so I don't have to move things around the 
> > processes again.
> 
>    Metis tells you a good partitioning; IT DOES NOT MOVE the elements to form 
> a good partitioning. Do you move the elements around based on what Metis told 
> you, and similarly do you renumber the elements (and vertices) to be 
> contiguously numbered on each process, with the first process getting the 
> first set of numbers, the second process the second set of numbers, etc.?
> 
>    If you do all that, then when you create the Vec and Mat you should simply 
> set the local size (based on the number of local vertices on each process). 
> You never need to use PetscLayoutCreate, and in fact if your code were in C 
> you would never use PetscLayoutCreate().
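> 
>    For example, in Fortran the local sizes can be given directly when the Vec 
> and Mat are created (a sketch; nloc here stands for the number of locally 
> owned vertices after the renumbering):
> 
> ```fortran
> ! nloc = number of locally owned rows/vertices; PETSc derives the
> ! global sizes and ownership ranges from the local sizes.
> call VecCreateMPI(PETSC_COMM_WORLD, nloc, PETSC_DETERMINE, x, ierr)
> 
> call MatCreate(PETSC_COMM_WORLD, A, ierr)
> call MatSetSizes(A, nloc, nloc, PETSC_DETERMINE, PETSC_DETERMINE, ierr)
> call MatSetFromOptions(A, ierr)
> ```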
> 
>    If you do not do all that then you need to do that first before you start 
> calling PETSc.
> 
>    Barry
> 
> >
> > No, I don't need it for something else.
> >
> > Cheers
> > Hossein
> >
> >
> >
> >
> > On Tue, May 13, 2014 at 6:36 PM, Matthew Knepley <[email protected]> wrote:
> > On Tue, May 13, 2014 at 11:07 AM, Hossein Talebi <[email protected]> 
> > wrote:
> > Hi All,
> >
> >
> > I am using PETSc from Fortran. I would like to define my own layout, i.e., 
> > which row belongs to which CPU, since I have already done the domain 
> > decomposition.  It appears that "PetscLayoutCreate" and related routines 
> > do this. But the manual says it is not provided in Fortran.
> >
> > Is there any way that I can do this using Fortran? Does anyone have an example?
> >
> > You can do this for Vec and Mat directly. Do you want it for something else?
> >
> >   Thanks,
> >
> >      Matt
> >
> > Cheers
> > Hossein
> >
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their 
> > experiments is infinitely more interesting than any results to which their 
> > experiments lead.
> > -- Norbert Wiener
> >
> >
> >
> > --
> > www.permix.org
> 
> 
> 
> 
> -- 
> www.permix.org
