> George Woltman wrote:
> 
> > The optimal breakeven point for factoring vs. LL testing
> > changes every time the factoring code or LL code changes.
> > In the old days, the breakeven point was 2^56; now it's 2^57.
> 
> Does that take into account the fact that a factor found does not
> require time to check, whereas an LL residue needs to be
> computed twice?

Interesting.

Actually, double-checking found factors *is* required, but it's *very* 
cheap, since you can just repeat the one successful trial division, 
omitting all the failed attempts.
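
For the curious, here's a toy sketch in Python (my own illustration, 
not the actual Prime95 code) of why the check is a single modular 
exponentiation: f divides 2^p - 1 exactly when 2^p mod f == 1.

    def verify_mersenne_factor(p, f):
        # f divides M_p = 2^p - 1 iff 2^p is congruent to 1 (mod f);
        # Python's three-argument pow() does this in O(log p) steps.
        return pow(2, p, f) == 1

    # Example: 223 is a known factor of M37 = 2^37 - 1.
    assert verify_mersenne_factor(37, 223)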

The optimum amount of factoring to do is (obviously?) to balance 
the factoring time against the LL test time multiplied by the 
probability of finding a factor. Just how this is done (or "guessed 
at") I don't know; however, having "partial" factorization attempts 
already done seems to muddy the water a bit, as does the extra 
time to double-check an LL test, as you say.
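
As a rough toy model of that balance (my own guess at the shape of 
the calculation, not the program's actual logic): heuristically the 
chance that M_p has a factor between 2^(b-1) and 2^b is about 1/b, 
so pushing the trial-factoring depth one more bit pays off while the 
cost of that bit is below 1/b times the LL cost, doubled because a 
found factor also saves the double-check run.

    def worth_another_bit(bit_cost, ll_cost, b):
        # Expected LL time saved by finding a factor in
        # [2^(b-1), 2^b] is roughly (1/b) * 2 * ll_cost
        # (first-time test plus double-check both avoided).
        return bit_cost < (2.0 * ll_cost) / b

    # Hypothetical timings, purely for illustration: if an LL test
    # takes 400 hours and bit 57 takes 12 hours of trial factoring,
    # bit 57 is still worthwhile.
    print(worth_another_bit(12.0, 400.0, 57))   # True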

I think the real point is that trial factoring with the current program 
only works up to 2^64 (approx. 20 decimal digits). ECM is expensive: 
running enough curves to be reasonably sure of finding all factors 
up to approx. 40 decimal digits takes *much* longer than running 
an LL test, and it is in any case impractical for exponents 
>1,000,000 (until we get processors with much greater-precision 
basic data types, i.e. longer registers!)
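
By way of illustration, a naive trial-factoring loop looks something 
like this (a toy Python sketch, nothing like the program's optimised 
sieving). It exploits the two classical facts that any factor of M_p 
(p prime) has the form q = 2kp+1 and satisfies q = 1 or 7 (mod 8):

    def trial_factor(p, max_bits=64):
        # Test candidates q = 2kp + 1 with q mod 8 in {1, 7},
        # using one modular exponentiation per candidate.
        k = 1
        while True:
            q = 2 * k * p + 1
            if q >= 1 << max_bits:
                return None          # no factor below 2^max_bits
            if q % 8 in (1, 7) and pow(2, p, q) == 1:
                return q
            k += 1

    # Example: recovers Cole's 1903 factor of M67 in a few seconds.
    print(trial_factor(67))   # 193707721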

> factoring rules!

Well, up to a point ... but just how long would it take you to *prove* 
that, e.g., Clarkson's Number (M3021377) is prime, by eliminating 
all possible factors?
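
A back-of-envelope calculation shows the scale: any factor of M_p 
lies below sqrt(M_p), which is about 2^(p/2), so for p = 3021377 
you would have to eliminate candidates of the form 2kp+1 up to 
roughly 2^1510688, i.e. on the order of 2^1510666 separate 
candidates. No conceivable amount of computing clears that, which 
is why the proof has to come from the LL test instead.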


Regards
Brian Beesley
