Hi Barry
Thanks for your answer.
I followed your suggestion (matrix type = MPIAIJ and superlu_dist on a
single processor), but I still get the same problem.
Rectangular matrices seem to be forbidden in MatGetOrdering(), whatever
package is used for the LU decomposition.
Here is the output:
./e
Dang, PETSc sucks, it really shouldn't be calling MatGetOrdering(), since
MPIAIJ can't use that information. And external packages ignore PETSc
orderings anyway.
I'll try to see why it is calling MatGetOrdering in this case.
Barry
> On Mar 13, 2018, at 7:04 AM, Natacha BEREUX wrote:
Hi all,
I get some randomness when solving certain equation systems with MUMPS.
When I repeatedly solve the attached equation system with KSP example
ex10, I get different solution vectors and therefore different residual
norms.
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpie
This is expected. In parallel, you cannot assume the order of operations
is preserved.
On 13 Mar 2018 at 8:14 PM, "Tim Steinhoff" wrote:
> Hi all,
>
> I get some randomness when solving certain equation systems with MUMPS.
> When I repeatedly solve the attached equation system by ksp example
>
Thanks for your fast reply.
I see that I can't expect the same results when changing the number of
processes, but how does MPI change the order of operations, when there
are for example 2 processes and the partitioning is fixed?
With GMRES I could not reproduce that behavior, no matter how many processes
> On Mar 13, 2018, at 1:10 PM, Tim Steinhoff wrote:
>
> Thanks for your fast reply.
> I see that I can't expect the same results when changing the number of
> processes, but how does MPI change the order of operations, when there
> are for example 2 processes and the partitioning is fixed?
For your purposes you should just use superlu or superlu_dist directly in
your code, not through PETSc. There is nothing to be gained by doing it
through PETSc.
Barry
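For reference, when one does go through PETSc, the external factorization
package is usually selected with run-time options rather than code changes;
a typical option set for a direct solve looks like the fragment below (the
solver-selection option name has varied across PETSc versions, so check the
documentation for the release you are using):

```
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist
```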
This is not relevant for your use case, but I have noted that the unneeded
use of MatGetOrdering() should be eliminated.