Ken Kriesel wrote:
> For a hypothetical pool of machines of constant reliability,
> the error rate per trillion operations (or should that be per
> hour of 100% CPU utilization?) would be about constant.
> The number of operations goes up slightly faster than the
> square of the exponent, so twice the exponent means more than
> 4 times the error rate per exponent. (More iterations,
> performed in more pieces, of slightly lower precision.)
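That scaling can be sketched with a rough cost model (my own back-of-the-envelope figure, not GIMPS's actual accounting): an LL test of M_p = 2^p - 1 runs about p - 2 squarings, and each squaring of a p-bit number via FFT costs on the order of p * log(p) word operations, so doubling p multiplies the total work by a bit more than 4.

```python
import math

def ll_cost(p: float) -> float:
    """Approximate operation count for a Lucas-Lehmer test of 2^p - 1:
    (p - 2) squaring iterations, each costing ~ p * log2(p)."""
    return (p - 2) * p * math.log2(p)

# Compare a hypothetical exponent against one twice as large.
ratio = ll_cost(2 * 50_000_000) / ll_cost(50_000_000)
print(f"doubling the exponent multiplies the work by ~{ratio:.2f}x")
```

With exponents around 50 million the ratio comes out slightly above 4, matching the "more than 4 times" estimate above.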
George Woltman increased the reliability of the software to decrease the
error rate (more checking during the LL tests, for example).

> It's also possible to have an error in factoring attempts
> yield a false positive for a factor that fails verification.

Factors are verified when submitted, according to a post by George Woltman
on the Mersenne forum. What can happen is that factors are *missed*
because of hardware errors.

> If I recall correctly, verification of a Mersenne prime is
> run on different software on a different computer
> architecture to maximize the chance of a software bug or
> computer hardware design error from yielding a false
> positive. During the QA effort, comparisons of interim LL
> test residues were made among differing software run on
> differing computer architectures.

This is only done when an LL test finds an exponent yielding a prime.
Otherwise, double-checks are accepted even if done by the same machine
with the same software version, provided the run with a different shift
gives the same 64-bit residue (16 hex digits).

Jacob
_______________________________________________
Prime mailing list
[email protected]
http://hogranch.com/mailman/listinfo/prime
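As an aside on the factor verification mentioned above: checking a claimed factor is cheap, because f divides 2^p - 1 exactly when 2^p ≡ 1 (mod f). The sketch below shows the idea; it is not the actual PrimeNet server code, just the obvious check.

```python
def divides_mersenne(p: int, f: int) -> bool:
    """True iff f divides the Mersenne number 2^p - 1.

    Uses Python's built-in three-argument pow for fast modular
    exponentiation, so the full 2^p - 1 is never materialized.
    """
    return pow(2, p, f) == 1

# Known example: 223 is a factor of M_37 = 2^37 - 1.
print(divides_mersenne(37, 223))  # True
print(divides_mersenne(37, 227))  # False
```

Because the check is this fast, the server can afford to verify every submitted factor immediately, which is why a false-positive factor cannot survive submission; only missed factors slip through.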
