After running my simulation multiple times on a multiprocessor computer, I have 
verified that using an iterative solver (the default GMRES) in PETSc to solve a 
linear system of equations (Cx = b) with more than 2 processors ALWAYS leads to 
erroneous results. Running identical code with identical settings, except for 
the number of processors (set to 2), ALWAYS gives me correct results.
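
For reference, my solve step is essentially the following (a simplified sketch 
rather than my exact code; the function and variable names are placeholders, 
and the calls follow the PETSc 3.2-era API, whose KSPSetOperators/KSPDestroy 
signatures differ slightly across releases):

#include <petscksp.h>

/* Simplified solve step: C is an assembled parallel (MPIAIJ) matrix,
   b and x are conforming parallel vectors. Names are placeholders. */
PetscErrorCode SolveSystem(Mat C, Vec b, Vec x)
{
  KSP                ksp;
  KSPConvergedReason reason;
  PetscInt           its;
  PetscErrorCode     ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,C,C,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPGMRES);CHKERRQ(ierr);  /* GMRES is the default anyway */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);    /* honors -ksp_* / -pc_* options */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

  /* A negative reason means GMRES diverged or stalled; x must not be
     trusted in that case even though KSPSolve returns normally. */
  ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
  ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"converged reason %d after %D iterations\n",
                     (int)reason,its);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}

As I understand it, x should only be trusted when the returned reason is 
positive; the run-time flags -ksp_converged_reason and 
-ksp_monitor_true_residual report the same information without code changes.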

I am really not sure what the point is of including iterative solvers if they 
produce erroneous results on a multiprocessor computer. The result I get from 
the multiprocessor runs is complete garbage, so I am not talking about a small 
percentage of error here. Also, if somebody could explain why the iterative 
solvers are error prone on multiprocessors, that would be highly appreciated.

I am very hopeful that there is a way around this problem, because PETSc is 
such a powerful and useful library that I really do not want to give up on it 
and start something else from scratch.


Do you think that a DIRECT SOLVER would circumvent this problem? My problem is 
that I have a very large system of equations, and the sparse coefficient 
matrix is huge (> 1e+8). I assemble this matrix in MATLAB, write it to a 
binary file, and read it into PETSc.
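
For completeness, the transfer uses PETSc's binary format: on the MATLAB side 
I write the matrix with the PetscBinaryWrite.m helper that ships with PETSc, 
and the read side is roughly the sketch below (placeholder names; this follows 
the PETSc 3.2 MatLoad calling sequence, where the Mat is created before 
loading, while older releases used MatLoad(viewer, type, &mat)):

#include <petscmat.h>

/* Sketch of the read side: "file" is a placeholder for whatever file
   PetscBinaryWrite produced in MATLAB. PETSc distributes the rows of
   the loaded matrix across the processes of PETSC_COMM_WORLD, which
   is why the same file works for any processor count. */
PetscErrorCode LoadMatrix(const char *file, Mat *C)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,C);CHKERRQ(ierr);
  ierr = MatSetType(*C,MATMPIAIJ);CHKERRQ(ierr);  /* parallel sparse AIJ storage */
  ierr = MatLoad(*C,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}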
So I really need to be able to solve this system of equations on a cluster of 
computers (which inherently means multiple processors and a distributed-memory 
setting). Does this mean I am completely out of luck with PETSc's iterative 
solver package, and that the only hope for me is a direct solver? I do have 
MUMPS downloaded and compiled with PETSc, so I will give that a try and see 
what results I obtain, but I am really surprised that iterative solvers are no 
good in large multiprocessor settings.
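
Concretely, my understanding is that with MUMPS compiled in I can keep the 
same code and switch to a parallel direct solve at run time with 
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps (newer 
PETSc releases spell the last option -pc_factor_mat_solver_type), or do the 
equivalent programmatically, roughly as in this sketch (same 3.2-era API 
assumptions as above):

#include <petscksp.h>

/* Sketch: solve Cx = b with a parallel LU factorization through MUMPS
   instead of GMRES. Assumes PETSc was configured with MUMPS support;
   newer PETSc releases rename PCFactorSetMatSolverPackage to
   PCFactorSetMatSolverType. */
PetscErrorCode SolveDirect(Mat C, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,C,C,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);  /* apply the factorization once; no Krylov iteration */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc,"mumps");CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}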

Any insights, suggestions, or advice will be highly appreciated.

Thanks.

PS: I can attach my entire code, along with plots comparing the results 
obtained by solving Cx = b on 2 processors vs. 6 or 12 processors, if anybody 
wants to take a look. I get garbage if I run the iterative solver on 12 
processors.
                                          