> On Mar 13, 2018, at 1:10 PM, Tim Steinhoff <kandanov...@gmail.com> wrote:
> 
> Thanks for your fast reply.
> I see that I can't expect the same results when changing the number of
> processes, but how does MPI change the order of operations when there
> are, for example, 2 processes and the partitioning is fixed?

    Hmm, how do you know the partitioning is fixed? Is a random number
generator used in MUMPS or in the partitioning packages it uses?

   Barry
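
On the "order of operations" point in the quoted exchange below: floating-point
addition is not associative, so when partial contributions are summed with a
different grouping, the last digits of the result can change. A minimal C
sketch, independent of MUMPS (the values are made up), that shows the effect:

    #include <stdio.h>

    int main(void)
    {
      /* The same three numbers summed with two different groupings. */
      double a = 1.0e16, b = -1.0e16, c = 1.0;
      double left  = (a + b) + c;   /* (0.0) + 1.0 -> 1.0 */
      double right = a + (b + c);   /* b + c rounds back to b -> 0.0 */
      printf("left  = %.17g\n", left);
      printf("right = %.17g\n", right);
      return 0;
    }

If the parallel factorization or the residual reduction combines contributions
in a different order from one run to the next, the same mechanism can produce
small run-to-run differences like the two residual norms reported below.
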

> With GMRES I could not produce that behavior, no matter how many processes I used.
> 
> 2018-03-13 18:17 GMT+01:00 Stefano Zampini <stefano.zamp...@gmail.com>:
>> This is expected. In parallel, you cannot assume that the order of operations
>> is preserved.
>> 
>> On 13 Mar 2018 at 8:14 PM, "Tim Steinhoff" <kandanov...@gmail.com> wrote:
>>> 
>>> Hi all,
>>> 
>>> I get some randomness when solving certain equation systems with MUMPS.
>>> When I repeatedly solve the attached equation system with KSP example
>>> ex10, I get different solution vectors and therefore different residual
>>> norms.
>>> 
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.15502e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.15502e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.17364e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.17364e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.17364e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.15502e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.15502e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.17364e-12
>>> jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
>>> mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
>>> -pc_type lu -pc_factor_mat_solver_package mumps
>>> Number of iterations =   1
>>> Residual norm 4.15502e-12
>>> 
>>> It seems to depend on the combination of the number of processes and
>>> the particular equation system.
>>> I used GCC 7.2.0 and Intel 16, MUMPS 5.1.1 / 5.1.2 (with and without
>>> METIS/ParMETIS), and Open MPI 2.1.2, all with the same results.
>>> PETSc is the current maint branch, configured with:
>>> ./configure --download-mumps --with-debugging=0 --COPTFLAGS="-O3"
>>> --CXXOPTFLAGS="-O3" --FOPTFLAGS="-O3" --with-scalapack
>>> 
>>> Using "--download-fblaslapack --download-scalapack" didn't make a
>>> difference either.
>>> Can anyone reproduce that issue?
>>> 
>>> Thanks and kind regards,
>>> 
>>> Volker
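
Regarding the question above of whether the analysis/partitioning step involves
any randomness: one way to test that (assuming PETSc's runtime option
-mat_mumps_icntl_28 for the MUMPS control parameter ICNTL(28), which selects
sequential vs. parallel analysis) would be to force a sequential analysis and
check whether the residual norm becomes reproducible:

    mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly \
        -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_28 1

If the norms still differ from run to run with ICNTL(28)=1, the variation is
more likely coming from the numerical factorization or solve phase than from
the ordering/partitioning.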
