> We should try to increase this probability before starting any
> full LL test, perhaps by using better factoring algorithms.
> Will v19 have P-1 code that works with numbers of this size?
> But even with P-1, we might not be able to factor that much
> further. The time-consuming part of P-1 is multiplications
> mod Mp, just as in the LL test, so the crossover point between
> raising the factoring limit and starting the LL test will be a function
> of p (the number of iterations in the LL test), and a less-than-linear
> one, that is, we might not get very far.
> Do we have estimates already what the optimal bound
> values will be for trying P-1 on a given Mp, to minimize the
> (time of P-1 test) + (time of LL test * chance of not finding any factor) ?
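For a rough feel of that trade-off, here is a toy cost model in Python. The u**-u smoothness estimate (a crude Dickman-rho approximation), the 1.44*B1 stage-1 squaring count, and the 64-bit target factor size are all my own illustrative assumptions, not anything measured:

```python
from math import log

def p1_success_prob(B1, factor_bits):
    """Crude chance that q-1 for a ~factor_bits-bit factor q is B1-smooth,
    using the Dickman-rho approximation rho(u) ~= u**-u (toy model)."""
    u = factor_bits * log(2) / log(B1)
    return min(1.0, u ** -u)

def expected_time(p, B1, factor_bits=64):
    """Expected cost in squarings mod Mp: stage-1 P-1 needs about
    1.44 * B1 squarings (the exponent E has ~B1 / ln 2 bits), and the
    LL test needs p - 2."""
    t_p1 = 1.44 * B1
    t_ll = p - 2
    return t_p1 + t_ll * (1 - p1_success_prob(B1, factor_bits))

def best_bound(p, candidate_bounds):
    """Pick the B1 minimizing
    (time of P-1 test) + (time of LL test) * (chance of no factor)."""
    return min(candidate_bounds, key=lambda B1: expected_time(p, B1))
```

Under these toy numbers, best_bound(35_000_000, [10**4, 10**5, 10**6]) picks 10**5; the only point is that the optimum bound is neither zero nor enormous.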
As I understand the Pollard P-1 algorithm, the time required for 1 bit of
the exponent equals 1 iteration of the LL test (one squaring
mod Mp), so estimates wouldn't be that hard to produce. The main problem is
that to get a bound of any reasonable size, and to try multiple bases, you
are looking at a time comparable to the LL test. Also, one of the keen
things about GIMPS is that in its search for primes, it came up with loads
of factoring data for Mersennes. However, the P-1 algorithm would make any
factor distribution data a lot less usable (as it would not include factors
P where P-1 has a large prime factor).
On the other hand, the P-1 algorithm can take advantage of the special
form of Mersenne factors. We know that for any factor P of Mp, P-1 must be
divisible by 2p (twice the Mersenne exponent), and that in 1/2 of the
cases it is divisible by 8.
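As a concrete illustration of folding that special form into stage 1, here is a toy P-1 sketch in Python. The base 3, the bound B1, and the function names are my own choices for the example, not anything from the GIMPS code:

```python
from math import gcd

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i, alive in enumerate(sieve) if alive]

def pminus1_stage1(p, B1):
    """Toy stage-1 P-1 on Mp = 2^p - 1.

    E is the product of all prime powers <= B1, with 2*p multiplied in
    explicitly because every factor q of Mp satisfies q = 1 (mod 2p).
    Returns a nontrivial factor of Mp, or None."""
    Mp = (1 << p) - 1
    E = 2 * p
    for q in primes_up_to(B1):
        qk = q
        while qk * q <= B1:  # raise q to the largest power still <= B1
            qk *= q
        E *= qk
    x = pow(3, E, Mp)        # one modular squaring per bit of E
    g = gcd(x - 1, Mp)
    return g if 1 < g < Mp else None
```

For example, pminus1_stage1(29, 7) finds the factor 233 of M29 = 233 * 1103 * 2089 even with the tiny bound B1 = 7, because 233 - 1 = 2^3 * 29 divides E once the forced factor 2*29 is multiplied in.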
-Lucas
_________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ -- http://www.tasam.com/~lrwiman/FAQ-mers