Hi all,

I observe non-deterministic results when solving certain equation systems
with MUMPS. When I repeatedly solve the attached equation system with KSP
example 10 (ex10), I get different solution vectors and therefore
different residual norms.

jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.15502e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.17364e-12
jac@jac-VirtualBox:~/data/rep/petsc/src/ksp/ksp/examples/tutorials$
mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
Number of iterations =   1
Residual norm 4.15502e-12
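To make the repeated runs above less tedious, a minimal sketch (assuming the same working directory, file names, and options as in the transcript) that repeats the solve and tallies the distinct residual norms; more than one distinct value indicates the non-determinism:

```shell
#!/bin/sh
# Repeat the solve 20 times and count how often each residual norm appears.
run_solve() {
  mpiexec -np 4 ./ex10 -f ./1.mat -rhs ./1.vec -ksp_type preonly \
    -pc_type lu -pc_factor_mat_solver_package mumps
}
for i in $(seq 1 20); do run_solve; done \
  | grep 'Residual norm' | sort | uniq -c
```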

It seems to depend on the combination of the number of processes and the
particular equation system.
I tried GCC 7.2.0, Intel 16, MUMPS 5.1.1 / 5.1.2 (with and without
METIS/ParMETIS), and Open MPI 2.1.2, all with the same result.
PETSc configuration is the current maint branch:
./configure --download-mumps --with-debugging=0 --COPTFLAGS="-O3"
--CXXOPTFLAGS="-O3" --FOPTFLAGS="-O3" --with-scalapack

Using "--download-fblaslapack --download-scalapack" did not make a
difference either.
Can anyone reproduce this issue?

Thanks and kind regards,

Volker

Attachment: mumps-eqs.tar.gz