How can I tell whether numerical noise is really just noise, or whether
it is a symptom of a bug?  For example, on one or two OS/processor
combinations I get this failure (from chmm.pyx):

    sage: m.viterbi([0,1,10,10,1])
Expected:
    ([0, 0, 1, 1, 0], -9.0604285688230899)
Got:
    ([0, 0, 1, 1, 0], -9.0604285688230917)

Can I tell how accurate this is actually supposed to be?  I can
certainly just change the doctest to

    ([0, 0, 1, 1, 0], -9.0604285688230...)

but if the code is actually supposed to be accurate to a few more
decimal places, this is concealing a bug.  Sometimes I can look at the
code and easily figure out its supposed accuracy, but more frequently
I can't.  So what should be done in cases like this?
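One quick sanity check (plain Python, not Sage-specific; the `ulp_distance` helper below is my own sketch, not an existing API) is to count how many representable doubles lie between the expected and the observed value.  A difference of a few ULPs is the kind of last-bit rounding noise you get when floating-point operations are reordered by a different compiler or CPU; a difference of thousands of ULPs would point at a genuine accuracy problem.

```python
import struct

def ulp_distance(x, y):
    """Count the representable doubles between x and y.

    Reinterprets each double's bit pattern as a signed 64-bit
    integer; for two finite values of the same sign, the absolute
    difference of those integers is the number of ULPs apart.
    """
    ix = struct.unpack("<q", struct.pack("<d", x))[0]
    iy = struct.unpack("<q", struct.pack("<d", y))[0]
    return abs(ix - iy)

expected = -9.0604285688230899
got      = -9.0604285688230917

# A handful of ULPs = ordinary rounding noise, not a bug.
print(ulp_distance(expected, got))

# Relative error; here it comes out near machine epsilon (~2e-16).
print(abs(expected - got) / abs(expected))
```

On these two values the distance is only an ULP or two, which is consistent with harmless platform-dependent rounding rather than an algorithmic bug, so truncating the doctest with `...` seems defensible.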

--
John

