Using parMetis in petsc for ordering

2007-01-11 Thread Dimitri Lecas
Barry Smith wrote: 1) The PETSc LU and Cholesky solvers only run sequentially. 2) The parallel LU and Cholesky solvers PETSc interfaces to (SuperLU_dist, MUMPS, Spooles, DSCPACK) do NOT accept an external ordering provided for them. Hence we do not have any setup for

Using parMetis in petsc for ordering

2007-01-11 Thread Dimitri Lecas
Barry Smith wrote: Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is likely just to make the LU factorization have much more fill and require much more

Using parMetis in petsc for ordering

2007-01-11 Thread Barry Smith
On Thu, 11 Jan 2007, Dimitri Lecas wrote: Barry Smith wrote: Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is likely just to make the LU

Using parMetis in petsc for ordering

2007-01-11 Thread Dimitri Lecas
Barry Smith wrote: On Thu, 11 Jan 2007, Dimitri Lecas wrote: Barry Smith wrote: Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is

Using parMetis in petsc for ordering

2007-01-11 Thread Matthew Knepley
Reordering a matrix can result in fewer iterations for an iterative solver. Matt On 1/11/07, Dimitri Lecas dimitri.lecas at free.fr wrote: Barry Smith wrote: On Thu, 11 Jan 2007, Dimitri Lecas wrote: Barry Smith wrote: Dimitri, No, I think this is not the
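
A minimal sketch of the idea Matt describes, assuming a sequential AIJ matrix A, right-hand side b, and solution vector x created by the caller; it uses current PETSc API names (PetscCall and the three-argument KSPSetOperators postdate this 2007 thread) and a reverse Cuthill-McKee ordering purely for illustration, not code from the thread:

    #include <petscksp.h>

    /* Solve A x = b after symmetrically permuting the system with an RCM
       ordering; a more banded/clustered structure can reduce the iteration
       count for some preconditioners. */
    static PetscErrorCode SolveReordered(Mat A, Vec b, Vec x)
    {
      IS  rperm, cperm;
      Mat Aperm;
      KSP ksp;

      PetscFunctionBeginUser;
      PetscCall(MatGetOrdering(A, MATORDERINGRCM, &rperm, &cperm));
      PetscCall(MatPermute(A, rperm, cperm, &Aperm));
      PetscCall(VecPermute(b, rperm, PETSC_FALSE));  /* reorder the right-hand side to match */

      PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
      PetscCall(KSPSetOperators(ksp, Aperm, Aperm));
      PetscCall(KSPSetFromOptions(ksp));             /* e.g. -ksp_type gmres -pc_type ilu */
      PetscCall(KSPSolve(ksp, b, x));
      PetscCall(VecPermute(x, cperm, PETSC_TRUE));   /* map the solution back to the original ordering */

      PetscCall(KSPDestroy(&ksp));
      PetscCall(MatDestroy(&Aperm));
      PetscCall(ISDestroy(&rperm));
      PetscCall(ISDestroy(&cperm));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

Whether the permuted system actually converges faster depends on the preconditioner; for ILU-type preconditioners the ordering can matter a great deal, for Jacobi it changes nothing.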

Using parMetis in petsc for ordering

2007-01-11 Thread Barry Smith
In parallel matrix-vector products (used by all the KSP methods) the amount of communication is the number of cut edges of the graph of the matrix. Repartitioning with Metis reduces the number of cut edges. Note: we don't actually advocate doing it this way. One should partition the
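
A sketch of the repartitioning route Barry mentions, hedged: it assumes a distributed MPIAIJ matrix A whose graph is to be rebalanced, and it uses PETSc's MatPartitioning interface with the ParMetis backend under its current names (the 2007 API differed in detail). The result is an index set saying which rank each local row should move to; actually redistributing the matrix is a separate step.

    #include <petscmat.h>

    /* Ask ParMetis (through PETSc's MatPartitioning interface) for a partition
       of the graph of A that reduces the number of cut edges, and hence the
       communication volume of parallel matrix-vector products. */
    static PetscErrorCode RepartitionInfo(Mat A, IS *rowToRank)
    {
      MatPartitioning part;

      PetscFunctionBeginUser;
      PetscCall(MatPartitioningCreate(PETSC_COMM_WORLD, &part));
      PetscCall(MatPartitioningSetAdjacency(part, A));
      PetscCall(MatPartitioningSetType(part, MATPARTITIONINGPARMETIS));
      PetscCall(MatPartitioningSetFromOptions(part));
      PetscCall(MatPartitioningApply(part, rowToRank));  /* entry i = rank that should own local row i */
      PetscCall(MatPartitioningDestroy(&part));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

From the resulting index set one can build a global numbering (ISPartitioningToNumbering) and extract the redistributed matrix, but Barry's caveat stands: it is usually better to partition the underlying mesh or graph before the matrix is ever assembled.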

Using parMetis in petsc for ordering

2007-01-10 Thread Barry Smith
Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is likely just to make the LU factorization have much more fill and require much more floating-point operations).

Using parMetis in petsc for ordering

2007-01-08 Thread Dimitri Lecas
Hello, I have to test the ParMetis ordering for factorization and I would like to know whether it is possible to use a user-provided ordering. If I understand the manual correctly, I have to use MatOrderingRegisterDynamic and PCFactorSetMatOrdering, but the sentence Currently we support orderings only for
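
For reference, a sketch of what registering and selecting a user ordering looks like. It uses the present-day names MatOrderingRegister and PCFactorSetMatOrderingType (assumed here to be the later counterparts of the MatOrderingRegisterDynamic / PCFactorSetMatOrdering named in the thread), and the "ordering" it registers is just the identity permutation, purely for illustration:

    #include <petscmat.h>
    #include <petscpc.h>

    /* A user ordering routine must fill row and column permutation index sets
       for mat. This one simply returns the natural (identity) ordering. */
    static PetscErrorCode MyOrdering(Mat mat, MatOrderingType type, IS *row, IS *col)
    {
      PetscInt n;

      PetscFunctionBeginUser;
      PetscCall(MatGetLocalSize(mat, &n, NULL));
      PetscCall(ISCreateStride(PETSC_COMM_SELF, n, 0, 1, row));
      PetscCall(ISCreateStride(PETSC_COMM_SELF, n, 0, 1, col));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

    /* Registration and selection, done once after PetscInitialize():
         PetscCall(MatOrderingRegister("my_order", MyOrdering));
         PetscCall(PCFactorSetMatOrderingType(pc, "my_order"));
       or from the command line: -pc_factor_mat_ordering_type my_order        */

As Barry's replies below make clear, such an ordering is only honored by PETSc's own sequential LU/Cholesky factorizations, not by the external parallel packages.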

Using parMetis in petsc for ordering

2007-01-08 Thread Barry Smith
1) The PETSc LU and Cholesky solvers only run sequentially. 2) The parallel LU and Cholesky solvers PETSc interfaces to (SuperLU_dist, MUMPS, Spooles, DSCPACK) do NOT accept an external ordering provided for them. Hence we do not have any setup for doing parallel matrix
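
To make point 2 concrete (hedged: the routine and option names below, e.g. PCFactorSetMatSolverType and MATSOLVERMUMPS, are from later PETSc releases and postdate this thread), the external parallel direct solvers are reached through the LU preconditioner and compute their own fill-reducing ordering internally, while the sequential PETSc LU is the one whose ordering the user can choose:

    #include <petscksp.h>

    /* Sequential PETSc LU: the fill-reducing ordering can be selected, e.g.
       nested dissection. Parallel direct solves go through an external
       package (here MUMPS), which performs its own ordering internally. */
    static PetscErrorCode ConfigureDirectSolve(KSP ksp, PetscBool parallel)
    {
      PC pc;

      PetscFunctionBeginUser;
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(PCSetType(pc, PCLU));
      if (parallel) {
        PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));  /* or superlu_dist, ... */
      } else {
        PetscCall(PCFactorSetMatOrderingType(pc, MATORDERINGND)); /* nested dissection for PETSc's own LU */
      }
      PetscCall(PCSetFromOptions(pc));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

The same choices can be made without code through options such as -pc_type lu and -pc_factor_mat_ordering_type nd in those later releases.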