I have seen similar behavior on my mac (works fine on Linux) -- I reported this to the mailing list a few weeks back. I eventually tracked it down to a BLAS issue but gave up on finding the exact cause as I needed to move on -- I moved over to MUMPS instead. But the problem is not in your imagination.

If I have the time, I will try to get back to it (especially since I finally learned that you have to use dsymutil to get line numbers in the debugger/valgrind).
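A minimal sketch of that dsymutil step on OS X, assuming the executable is called ./mysolver (the name is just a placeholder):

    dsymutil ./mysolver                  # writes ./mysolver.dSYM so the debugger and valgrind can resolve file/line info
    valgrind --dsymutil=yes ./mysolver   # or let valgrind invoke dsymutil itself via its --dsymutil option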
-sanjay

On 1/28/13 3:09 PM, Jed Brown wrote:
> Vague "random answers" isn't very helpful. If there is a real problem,
> we'd like a test case so we can track it down.
>
>
> On Mon, Jan 28, 2013 at 3:58 PM, Gaetan Kenway <kenway at utias.utoronto.ca> wrote:
>
>     Hi everyone
>
>     I have exactly the same issue, actually. When I updated to
>     petsc-3.3, SuperLU_dist was giving me random answers from
>     KSPSolve(). Maybe half of the time you would get the same result
>     as with 3.2; other times it was a little off, and other times wildly
>     different. I am using SuperLU_dist with a PREONLY KSP object.
>
>     I haven't tracked down what is causing it and reverted back to
>     petsc-3.2, which still works.
>
>     Also, to fix the issue with the configure below, just drop the
>     --download-metis option. You need it for 3.3 but not for 3.2.
>
>     Gaetan
>
>     On Mon, Jan 28, 2013 at 4:42 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>         Send -ksp_monitor_true_residual -ksp_view output for both
>         cases so we can try to identify the source of the different
>         convergence behavior.
>
>         On Mon, Jan 28, 2013 at 3:37 PM, Brian Helenbrook <bhelenbr at clarkson.edu> wrote:
>
>             Dear Petsc-Users-List,
>
>             I recently upgraded to petsc-3.3-p5 from petsc-3.2-p7 and
>             the results from my code have changed. I am using SuperLU
>             with the following options:
>
>             -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist
>
>             Everything was working with petsc-3.2, but now I get totally
>             different answers and the iteration doesn't converge. My
>             build configuration is
>
>             ./config/configure.py --prefix=${HOME}/Packages --with-fortran=0 --download-superlu_dist=1 --with-x=0 --download-parmetis=1 --download-metis=1 --with-mpi-dir=${HOME}/Packages --with-valgrind-dir=${HOME}/Packages
>
>             I am running on OS X 10.8.2 with openmpi-1.6.3.
>
>             I have run valgrind on my code and it is clean (except for
>             start-up issues with MPI, which occur before my code is entered).
>
>             I'm not sure how to go about debugging this. What I've tried
>             is to re-install petsc-3.2-p7, but now I am having trouble
>             getting that to build:
>
>             ./config/configure.py --prefix=${HOME}/Packages --with-fortran=0 --download-superlu_dist=1 --with-x=0 --download-parmetis=1 --download-metis=1 --with-mpi-dir=${HOME}/Packages --with-valgrind-dir=${HOME}/Packages
>
>             ===============================================================================
>                       Configuring PETSc to compile on your system
>             ===============================================================================
>             ===============================================================================
>                       Compiling & installing Metis; this may take several minutes
>             ===============================================================================
>             *******************************************************************************
>                 UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
>             -------------------------------------------------------------------------------
>             Error running make on Metis: Could not execute "cd /Users/bhelenbr/Packages/petsc-3.2-p7/externalpackages/metis-4.0.3 && make clean && make library && make minstall && make clean":
>
>             Any ideas what direction to go with this?
>
>             Thanks,
>
>             Brian
>
>             Brian Helenbrook
>             Associate Professor
>             362 CAMP
>             Mech. and Aero. Eng. Dept.
>             Clarkson University
>             Potsdam, NY 13699-5725
>
>             work: 315-268-2204
>             fax: 315-268-6695
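For reference, a run that combines the solver options above with the diagnostics Jed asked for would look roughly like this (the executable name ./mysolver and the process count are placeholders):

    mpiexec -n 4 ./mysolver -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist \
        -ksp_monitor_true_residual -ksp_view

The -ksp_view output also confirms which factorization package was actually used, which is worth checking when comparing the 3.2 and 3.3 runs.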
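And a sketch of the petsc-3.2 configure line Gaetan describes, i.e. the same command minus --download-metis (the paths are unchanged from the report above):

    ./config/configure.py --prefix=${HOME}/Packages --with-fortran=0 \
        --download-superlu_dist=1 --with-x=0 --download-parmetis=1 \
        --with-mpi-dir=${HOME}/Packages --with-valgrind-dir=${HOME}/Packages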
