Steve,

    Thanks for your report, and sorry about the difficulties. We made the change 
thinking that the sooner we report a floating-point error, the better for the 
user. But you are right that sometimes it makes sense for the application to 
handle it directly.

    We will figure out a way to manage this. I am thinking of a global flag you 
can set to indicate that you don't want these checks performed; then you will 
get back the previous behavior.

   Barry

On Feb 22, 2013, at 5:07 PM, "Cea, Stephen M" <stephen.m.cea at intel.com> 
wrote:

> Hi,
> 
>               We are upgrading from PETSc 3.0 to 3.3. It is looking good 
> because it is considerably faster. We have one issue: VecNorm() has been 
> changed so that it raises an error if it finds a NaN/Inf in the vector 
> (see the message at the end). We have always exited the program when we 
> get a PETSc error, but for this case that is not a good idea, since we 
> used to detect this ourselves and then cut the timestep and re-solve. We 
> might be able to just treat the error from KSPSolve() ( iError = 
> KSPSolve(slp.sles, b, x); ) as non-convergence and cut the timestep and 
> re-solve, but I am wondering if we can tell from the return value which 
> error it is. I tried looking in the manual but couldn't find a list of 
> error codes.
> 
> Thanks,
> steve
> 
> [0]PETSC ERROR: --------------------- Error Message 
> ------------------------------------
> [0]PETSC ERROR: Floating point exception!
> [0]PETSC ERROR: Infinite or not-a-number generated in norm!
> [0]PETSC ERROR: 
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 
> 2012
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: 
> ------------------------------------------------------------------------
> [0]PETSC ERROR: 
> /nfs/pdx/disks/tcad_ptm_pdmg_work_03/usr/adlilak/FLPSCHECKINS/FLPS_NZ_PETSC/floops/src/floops.linux64
>  on a with-CPP- named dlxc0894.jf.intel.com by adlilak Thu Feb 21 23:43:15 
> 2013
> [0]PETSC ERROR: Libraries linked from 
> /p/dt/sdework9/mrduda/INN/Build2/petsc-3.3-p3/with-CPP-nomumps-icc13/lib
> [0]PETSC ERROR: Configure run at Fri Oct 19 09:42:58 2012
> [0]PETSC ERROR: Configure options --with-debugging=0 --with-mpi-compilers=0 
> --with-mpi-shared=0 --with-mpi=1 
> --with-mpi-include=/p/dt/sde/tools/em64t_SLES10/MPI/4.0.3.004/include64 
> --with-mpi-lib="[/p/dt/sde/tools/em64t_SLES10/MPI/4.0.3.004/lib64/libmpi.a,/p/dt/sde/tools/em64t_SLES10/MPI/4.0.3.004/lib64/libmpiif.a]"
>  --with-clanguage=C++ --with-vendor-compilers=intel --with-cc=icc 
> --COPTFLAGS="-O2 -g -sox" --with-cxx=icpc --CXXOPTFLAGS="-O2 -g -sox" 
> --with-fc=ifort --FOPTFLAGS="-O2 -g -sox" --LDFLAGS=-Wl,-rpath,$GCCDIR/lib64 
> --with-blas-lapack-lib="[/p/dt/sdework12/mrduda/KNC/c_compiler/install/mkl/lib/intel64/libmkl_rt.so]"
>  --with-scalapack=1 
> --with-scalapack-include=/p/dt/sdework12/mrduda/KNC/c_compiler/install/mkl/include
>  
> --with-scalapack-lib=/p/dt/sdework12/mrduda/KNC/c_compiler/install/mkl/lib/intel64/libmkl_scalapack_lp64.a
>  --with-blacs=1 
> --with-blacs-include=/p/dt/sdework12/mrduda/KNC/c_compiler/install/mkl/include
>  --with-blacs-lib=/p/dt/sdework12/mrduda/KNC/c_compiler/
> install/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.a
> [0]PETSC ERROR: 
> ------------------------------------------------------------------------
> [0]PETSC ERROR: VecNorm() line 169 in 
> /p/dt/sdework9/mrduda/INN/Build2/petsc-3.3-p3/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: KSPSolve_BCGS() line 78 in 
> /p/dt/sdework9/mrduda/INN/Build2/petsc-3.3-p3/src/ksp/ksp/impls/bcgs/bcgs.c
> [0]PETSC ERROR: KSPSolve() line 446 in 
> /p/dt/sdework9/mrduda/INN/Build2/petsc-3.3-p3/src/ksp/ksp/interface/itfunc.c
> 
> 
