-----Original Message-----
From: CARLETON GARRISON <[EMAIL PROTECTED]>
To: Daran <[EMAIL PROTECTED]>; [EMAIL PROTECTED] <[EMAIL PROTECTED]>
Date: 24 July 2001 00:18
Subject: Re: Mersenne: P-1

>Daran,
>
>    Again I'm not the most qualified person to reply but...
>
>    The name of the game is validate - by duplication.  You cannot make a
>case without duplicating the result.  This is to safeguard against the many
>gremlins that can occur - faulty overclocked CPUs, etc.  If someone finds a
>factor, someone over at PrimeNet can do a single trial division and confirm
>the result.

Yes.  Verifying a factor is trivial.
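To make "trivial" concrete: checking a claimed factor q of M(p) = 2^p - 1 is a single modular exponentiation, since q divides 2^p - 1 exactly when 2^p = 1 (mod q).  A minimal sketch (function name is mine, not PrimeNet's):

```python
def divides_mersenne(p, q):
    """q divides the Mersenne number 2^p - 1 exactly when
    2^p = 1 (mod q): one modular exponentiation, so a reported
    factor can be confirmed without redoing any of the search."""
    return pow(2, p, q) == 1

# Cole's 1903 factor of M(67):
print(divides_mersenne(67, 193707721))   # True
print(divides_mersenne(67, 193707723))   # False - not a factor
```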

>If not, then it is my understanding a first time LL will also
>do factoring.

As far as I can tell, both first time and DC LLs do P-1 factoring, but not
trial factoring.  (I suspect they will trial factor first if the exponent
has not already been trial factored far enough, but I don't recall
encountering this.)  However, it is likely that most of the exponents within
the current DC range were first time LL checked before P-1 factoring was
introduced.  It is possible that once exponents are reached that were P-1
checked the first time round, they might not be checked again.

>Whether the first time comes back prime or composite, then
>PrimeNet knows that two factorings confirm no factors to a certain
>boundary - say 2^60.

P-1 factoring uses boundaries in a different way, i.e. it searches a
different space (that of smooth factors) from trial factoring.
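To illustrate the difference: trial factoring finds a factor q only if q itself lies below the trial bound, whereas stage 1 of P-1 finds q whenever q - 1 is a product of small primes (B1-smooth).  A toy sketch follows - this is not how Prime95 implements it (Prime95 works modulo 2^p - 1 using the same FFT multiplication as the LL test), but the mathematics is the same:

```python
from math import gcd

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    for i in range(2, n + 1):
        if sieve[i]:
            yield i
            for j in range(i * i, n + 1, i):
                sieve[j] = False

def pm1_stage1(p, B1):
    """Toy stage-1 P-1 on N = 2^p - 1.  Finds a factor q whenever
    q - 1 is B1-smooth.  Any factor of a Mersenne number has the
    form q = 2*k*p + 1, so 2*p is folded into the exponent too."""
    N = (1 << p) - 1
    a = 3
    for q in primes_up_to(B1):
        qk = q
        while qk * q <= B1:       # lift q to its largest power <= B1
            qk *= q
        a = pow(a, qk, N)
    a = pow(a, 2 * p, N)          # account for the 2*k*p + 1 form
    g = gcd(a - 1, N)
    return g if 1 < g < N else 1  # 1 means "no factor found"

# M(29) = 233 * 1103 * 2089, and 233 - 1 = 2^3 * 29: once the
# forced factor 2*29 is counted, B1 = 5 covers the rest.
print(pm1_stage1(29, 5))   # 233
```

Note that 233 is far above any B1 = 5 trial-factoring reach, yet P-1 finds it cheaply because 232 is smooth - that is the different search space.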

It's a question of probabilities.  If P-1 factoring takes - say - 2% of the
time to do an LL, and a success eliminates the need for first time /and/
double check LLs, then it's worth doing one if it has a success rate of more
than 1%.  For the P-1 tests I've done, the estimates for the probability of
success ranged from 2-5%, depending upon how much memory I allowed it.
(I've not done enough to confirm the accuracy of these estimates, but I've
been told they're good.)

However, if the historical error rate for P-1s is the same as for LLs,
(which is a reasonable assumption, given that they use the same FFT code*),
then the probability of a second P-1 (using the same bounds) finding a
factor that a first missed will be about 1% of this, i.e. 0.02-0.05%, and
only a single DC would be saved.  Clearly not economical.
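Plugging in the figures above (all of them assumptions: the 2% cost, the mid-range 3% success estimate, the 1% error rate) makes the comparison explicit:

```python
p1_cost    = 0.02   # P-1 run time, as a fraction of one LL test (assumed)
p1_success = 0.03   # mid-range of the 2-5% success estimates quoted
err_rate   = 0.01   # assumed historical error rate

# First P-1: a success saves both the first-time LL and the double check.
first_value = p1_success * 2.0
print(first_value > p1_cost)            # True - worth running

# Repeat P-1 with the same bounds: it only gains when the first run
# erred, and by DC time only the double check (one LL) is left to save.
repeat_value = p1_success * err_rate * 1.0
print(repeat_value > p1_cost)           # False - not economical
```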

[...]

>    Saying all that doesn't mean we couldn't break out work into more work
>unit types, it would just mean we'd also have to have a factoring DC work
>unit type if it was to be removed from the LL runs.

No, for the reason given above.

>The question boils down
>to whether enough people would like to do factoring DCs.

It boils down to whether people with lots of memory (which is likely to be
the newer faster machines) would be willing to do some P-1 factoring on its
own, and whether the gains would be worth the programming effort.

This computer is a Duron running at 943MHz - hardly cutting edge speed -
with 512MB RAM, which is ample for stage 2 P-1s.  There must be many out
there - particularly those doing DCs - which don't have enough RAM, or are
running older clients.  I'd be happy to do nothing but P-1s on those
exponents that would otherwise never be tested in this way.

>Carleton

Daran

*A counterargument is that P-1 can use huge amounts of RAM - hundreds of
megabytes - and so might encounter memory errors that LLs don't.


_________________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers