Using ParMETIS in PETSc for ordering

2007-01-11 Thread Dimitri Lecas
Barry Smith wrote: 1) The PETSc LU and Cholesky solvers only run sequentially. 2) The parallel LU and Cholesky solvers PETSc interfaces to, SuperLU_dist, MUMPS, Spooles, DSCPACK, do NOT accept an external ordering provided for them. Hence we do not have any setup for
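
For context, a minimal sketch of the one knob this does leave you: choosing the ordering used by PETSc's own (sequential) LU. This assumes the current KSP/PC C API and omits error checking; the equivalent runtime option is -pc_factor_mat_ordering_type nd.

  #include <petscksp.h>

  /* Sketch: sequential LU with a nested-dissection ordering.
     A must be a sequential (MATSEQAIJ) matrix here. */
  PetscErrorCode direct_solve(Mat A, Vec b, Vec x)
  {
    KSP ksp;
    PC  pc;

    KSPCreate(PETSC_COMM_SELF, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetType(ksp, KSPPREONLY);          /* no Krylov iterations, factor only */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    PCFactorSetMatOrderingType(pc, MATORDERINGND);  /* or "rcm", "qmd", ... */
    KSPSolve(ksp, b, x);
    KSPDestroy(&ksp);
    return 0;
  }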

undefined reference to ....

2007-01-11 Thread Ben Tay
,static library. I wonder if it is a problem with mkl em64t or there's something wrong with my code/compilation. Thank you.

undefined reference to ....

2007-01-11 Thread Ben Tay
em64t or there's something wrong with my code/compilation. Thank you.

Using ParMETIS in PETSc for ordering

2007-01-11 Thread Dimitri Lecas
Barry Smith wrote: Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is likely just to make the LU factorization have much more fill and require much more
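
As a hedged aside (this 4x4 arrowhead example is not from the thread, just a standard illustration of the fill point):

  A = \begin{pmatrix}
        \ast & \ast & \ast & \ast \\
        \ast & \ast & 0    & 0    \\
        \ast & 0    & \ast & 0    \\
        \ast & 0    & 0    & \ast
      \end{pmatrix}

Eliminating the dense first row and column first fills in every zero, so L and U come out completely dense; reversing the ordering (dense row and column eliminated last) produces no fill at all. A partitioning chosen purely for load balance is blind to this, which is why direct solvers use fill-reducing orderings such as nested dissection or minimum degree.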

Using ParMETIS in PETSc for ordering

2007-01-11 Thread Barry Smith
On Thu, 11 Jan 2007, Dimitri Lecas wrote: Barry Smith wrote: Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is likely just to make the LU

Using ParMETIS in PETSc for ordering

2007-01-11 Thread Dimitri Lecas
Barry Smith wrote: On Thu, 11 Jan 2007, Dimitri Lecas wrote: Barry Smith wrote: Dimitri, No, I think this is not the correct way to look at things. Load balancing the original matrix is not necessarily a good thing for doing an LU factorization (in fact it is

Visual Studio compiler and PETSc

2007-01-11 Thread Satish Balay
We don't have prebuilt binaries. Suggest configuring with: config/configure.py --with-cc='win32fe cl' --with-cxx='win32fe cl' --with-fc=0 --with-clanguage=cxx --download-c-blas-lapack=1 If you encounter problems, send us configure.log at petsc-maint at mcs.anl.gov Satish On Fri, 12 Jan

Using ParMETIS in PETSc for ordering

2007-01-11 Thread Matthew Knepley
Reordering a matrix can result in fewer iterations for an iterative solver. Matt On 1/11/07, Dimitri Lecas dimitri.lecas at free.fr wrote: Barry Smith wrote: On Thu, 11 Jan 2007, Dimitri Lecas wrote: Barry Smith wrote: Dimitri, No, I think this is not the
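
A minimal sketch of that idea, assuming a sequential AIJ matrix and the current Mat C API (error checking omitted): reorder with reverse Cuthill-McKee, which reduces bandwidth and can make an ILU-preconditioned Krylov solve converge in fewer iterations.

  #include <petscmat.h>

  /* Sketch: permute A into RCM order before building a preconditioner. */
  PetscErrorCode reorder_rcm(Mat A, Mat *B)
  {
    IS rperm, cperm;

    MatGetOrdering(A, MATORDERINGRCM, &rperm, &cperm);
    MatPermute(A, rperm, cperm, B);   /* *B is A with rows/columns permuted */
    ISDestroy(&rperm);
    ISDestroy(&cperm);
    return 0;
  }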

Using ParMETIS in PETSc for ordering

2007-01-11 Thread Barry Smith
In parallel matrix-vector products (used by all the KSP methods) the amount of communication is proportional to the number of cut edges of the graph of the matrix. Repartitioning with METIS reduces the number of cut edges. Note: we don't actually advocate doing it this way. One should partition the
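
A minimal sketch of that repartitioning step, assuming PETSc was configured with ParMETIS (e.g. --download-parmetis) and using the current MatPartitioning C API; redistributing the matrix afterwards is left out:

  #include <petscmat.h>

  /* Sketch: compute a new row-to-rank assignment from the matrix graph. */
  PetscErrorCode repartition(Mat A, IS *rows_to_ranks)
  {
    MatPartitioning part;

    MatPartitioningCreate(PetscObjectComm((PetscObject)A), &part);
    MatPartitioningSetAdjacency(part, A);              /* graph of A */
    MatPartitioningSetType(part, MATPARTITIONINGPARMETIS);
    MatPartitioningApply(part, rows_to_ranks);         /* target rank per row */
    MatPartitioningDestroy(&part);
    /* One would then renumber (ISPartitioningToNumbering) and move the
       matrix/vector entries to their new owners. */
    return 0;
  }

Fewer cut edges means less data exchanged in each MatMult, as Barry notes; the thread's larger point stands, though: partitioning like this helps iterative solves, not the fill-reducing ordering of a direct factorization.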