"Steinar H. Gunderson" <[EMAIL PROTECTED]> writes
>On Thu, Jul 29, 1999 at 01:31:13PM -0600, Aaron Blosser wrote:
>>Egads...I'm not telling *anyone* to NOT use slow computers...I'm
>>"suggesting" they use < p166 for double-checks or factoring.
...
>>I hope I'm being clear that I don't consider slower computers useless
>>entirely...just useless for doing LL tests in the range of exponents
>>currently being assigned.

>Not useless, just slow. I still think it makes perfect sense.


I remember discussing a while back the problem of undetected errors. I
came to the conclusion that, for example, the probability of getting a
6,000,000-ish exponent wrong is about 1/20. I assumed that the error
rate is proportional to the number of cycles in the LL test.

Then someone suggested that it is more realistic for the error rate to
be proportional to the duration of the run. It could be a serious
problem for slow computers that take months and months to do a single LL
test. 

An undetected bit error in the wrong place would invalidate the entire
run. On that assumption, the chance of this happening is inversely
proportional to the speed of the computer. For slow computers it could
therefore be considerably worse than the 1/20 I estimated.
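To make that scaling concrete, here is a small sketch (my own
illustration, not from anyone's actual error model). It assumes errors
arrive at a constant rate per unit of wall-clock time, calibrated so
that a reference machine has the 1/20 per-run probability above; a
machine k times slower then takes k times as long and sees:

```python
import math

P_REF = 1 / 20                 # assumed per-run error probability (reference machine)
lam = -math.log(1 - P_REF)     # implied per-run hazard on the reference machine

def error_probability(slowdown):
    """P(at least one error) for a machine `slowdown` times slower."""
    return 1 - math.exp(-lam * slowdown)

for k in (1, 2, 5, 10):
    print(f"{k:2d}x slower: P(error) ~ {error_probability(k):.2f}")
# A 10x slower machine ends up around 0.40 rather than 0.05.
```

Of course the constant-rate assumption is the whole question; if errors
track cycles rather than time, the speed of the machine drops out.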

On the other hand, factoring is relatively immune because there is not
the long-range interdependence that exists with the LL test. Also the
job of factoring is to find factors. There are no 'negative' results to
collect. It is not a disaster if the occasional factor is missed.

However, it was also pointed out that old processors and memory are more
reliable. 

-- 
Tony
_________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers