Re: [petsc-users] Get LU decomposition of a rectangular matrix

2018-03-13 Thread Natacha BEREUX
Hi Barry, Thanks for your answer. I followed your suggestion (matrix type = MPIAIJ and superlu_dist on a single processor), but I still get the same problem. Rectangular matrices seem to be forbidden in MatGetOrdering(), whatever package is used for the LU decomposition. Here is the output: ./e

Re: [petsc-users] Get LU decomposition of a rectangular matrix

2018-03-13 Thread Smith, Barry F.
Dang, PETSc sucks, it really shouldn't be calling MatGetOrdering() since MPIAIJ can't use that information. And external packages ignore PETSc orderings anyway. I'll try to see why it is calling MatGetOrdering() in this case. Barry > On Mar 13, 2018, at 7:04 AM, Natacha BEREUX wrote:
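The setup the thread discusses can be sketched as a run of a KSP tutorial with runtime options selecting LU through SuperLU_DIST on an MPIAIJ matrix on a single process. This is a hedged sketch, not the poster's actual command: the binary name and the matrix file name are placeholders, and the option spelling follows the PETSc release current at the time of the thread.

```
# Hypothetical invocation; ./ex10 and matrix.petsc are illustrative names.
# Selects LU factorization performed by the external SuperLU_DIST package
# on an MPIAIJ matrix, running on one MPI process.
mpiexec -n 1 ./ex10 -f0 matrix.petsc \
  -ksp_type preonly -pc_type lu \
  -pc_factor_mat_solver_package superlu_dist \
  -mat_type mpiaij
```

Note that in later PETSc releases the option was renamed to -pc_factor_mat_solver_type, so the spelling depends on the installed version.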

[petsc-users] Non deterministic results with MUMPS?

2018-03-13 Thread Tim Steinhoff
Hi all, I get some randomness when solving certain equation systems with MUMPS. When I repeatedly solve the attached equation system by ksp example 10, I get different solution vectors and therefore different residual norms. jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$ mpie

Re: [petsc-users] Non deterministic results with MUMPS?

2018-03-13 Thread Stefano Zampini
This is expected. In parallel, you cannot assume the order of operations is preserved. On 13 Mar 2018 8:14 PM, "Tim Steinhoff" wrote: > Hi all, > > I get some randomness when solving certain equation systems with MUMPS. > When I repeatedly solve the attached equation system by ksp example >

Re: [petsc-users] Non deterministic results with MUMPS?

2018-03-13 Thread Tim Steinhoff
Thanks for your fast reply. I see that I can't expect the same results when changing the number of processes, but how does MPI change the order of operations when there are, for example, 2 processes and the partitioning is fixed? With GMRES I could not reproduce that behavior, no matter how many proc

Re: [petsc-users] Non deterministic results with MUMPS?

2018-03-13 Thread Smith, Barry F.
> On Mar 13, 2018, at 1:10 PM, Tim Steinhoff wrote: > > Thanks for your fast reply. > I see that I can't expect the same results when changing the number of > processes, but how does MPI change the order of operations, when there > are for example 2 processes and the partitioning is fixed?
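The non-determinism discussed in this thread comes from the fact that floating-point addition is not associative, so combining partial results from MPI processes (or threads) in a different order can change the final value even with a fixed partitioning. A minimal self-contained illustration, unrelated to MUMPS itself:

```python
# Floating-point addition is not associative: summing the same four
# values in two different orders gives two different results. This is
# why parallel reductions, whose combination order can vary from run
# to run, need not be bitwise reproducible.
vals = [1e16, 1.0, -1e16, 1.0]

# Left-to-right: 1e16 + 1.0 rounds back to 1e16 (the 1.0 is below
# one ulp of 1e16), so the first 1.0 is lost.
left_to_right = ((vals[0] + vals[1]) + vals[2]) + vals[3]

# Pairwise: the large values cancel exactly first, so both 1.0s survive.
pairwise = (vals[0] + vals[2]) + (vals[1] + vals[3])

print(left_to_right)  # 1.0
print(pairwise)       # 2.0
```

The same mechanism applies inside a direct solver: dynamic scheduling of the factorization tasks can reorder the accumulation of contributions, so residual norms may differ in the last digits between otherwise identical runs.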

Re: [petsc-users] Get LU decomposition of a rectangular matrix

2018-03-13 Thread Smith, Barry F.
For your purposes you should just use superlu or superlu_dist directly in your code, not through PETSc. Barry. There is nothing to be gained by doing it through PETSc. This is not relevant for your use case, but I have noted that the unneeded use of MatGetOrdering() should be eliminated ht