Hi,
At 03:23 PM 2/25/00 +0100, Reto Keiser wrote:
>parallel use of p-1 and trial factoring
>---------------------------------------
>
>Why can't we first do the trial factoring up to n-2 bits (about 1/4 of
>the trial factoring time), then run the P-1 factoring up to 1/3 of the
>B1 value? After this, we can complete the trial factoring process, and at
>the end we complete the P-1 (using the save file or intermediate file).
>(the parameters can be optimized)
I can't see any flaws in your reasoning, although it would be a bit unwieldy
to implement.
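For anyone curious, here is a rough sketch of that schedule in Python. It is
only an illustration of the idea on toy exponents, not prime95 code, and the
trial factoring and P-1 routines are deliberately naive (a real resume, for
example, would also top up the prime powers already processed in stage 1):

  from math import gcd, isqrt

  def small_primes(limit):
      # All primes <= limit via a simple sieve (fine for toy bounds).
      sieve = bytearray([1]) * (limit + 1)
      sieve[0:2] = b"\x00\x00"
      for i in range(2, isqrt(limit) + 1):
          if sieve[i]:
              sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
      return [i for i, v in enumerate(sieve) if v]

  def trial_factor(p, lo_bits, hi_bits):
      # Trial factor M(p) = 2^p - 1 over candidates q = 2*k*p + 1 whose
      # bit length lies in (lo_bits, hi_bits].  Returns a factor or None.
      q = 2 * p + 1
      while q.bit_length() <= hi_bits:
          if (q.bit_length() > lo_bits and q % 8 in (1, 7)
                  and pow(2, p, q) == 1):
              return q
          q += 2 * p
      return None

  def pminus1_stage1(p, b1_from, b1_to, residue=3):
      # P-1 stage 1 on M(p): raise the residue to q^e for every prime q in
      # (b1_from, b1_to], where q^e is the largest power of q <= b1_to.
      # Feeding the returned residue back in later resumes the run, playing
      # the role of prime95's save file.
      m = (1 << p) - 1
      for q in small_primes(b1_to):
          if q <= b1_from:
              continue
          e = q
          while e * q <= b1_to:
              e *= q
          residue = pow(residue, e, m)
      f = gcd(residue - 1, m)
      return (f if 1 < f < m else None), residue

  def factor_schedule(p, target_bits, B1):
      # The interleaved schedule from the quoted message, parameters not
      # optimized.
      # 1. Trial factor to target_bits - 2; each extra bit doubles the
      #    work, so this is roughly 1/4 of the full trial factoring time.
      f = trial_factor(p, 0, target_bits - 2)
      if f:
          return f
      # 2. P-1 stage 1 up to B1/3; keep the residue as a "save file".
      f, save = pminus1_stage1(p, 0, B1 // 3)
      if f:
          return f
      # 3. Finish trial factoring over the last two bit levels.
      f = trial_factor(p, target_bits - 2, target_bits)
      if f:
          return f
      # 4. Finish P-1 up to the full B1, resuming from the saved residue.
      f, _ = pminus1_stage1(p, B1 // 3, B1, residue=save)
      return f

  # Toy run: M(67) = 193707721 * 761838257287 (Cole's factorization);
  # the 28-bit factor 193707721 turns up in step 1 of the schedule.
  print(factor_schedule(67, 30, 10000))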
>no 68 bit factors
>-----------------
>
>So far, more than 210 factors have been found for 10-million-digit
>numbers, and more than 280 exponents have been trial factored up to 68 bits.
>Some (about 7) 67-bit factors were found, but none with 68 bits.
My database has:
33219661 73867482830512390441
33223387 83006905661336745889
33221387 123317319076102495049
33235409 128314644111933147703
33238463 131707491089550166169
33230671 139408728702078150121
33224957 193425473534465274127
That's six 67-bit factors and one 68-bit factor. Not the expected
distribution, but nothing to be concerned about yet either.
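A quick sanity check of those sizes in Python, with the factors copied
straight from the list above:

  factors = [73867482830512390441, 83006905661336745889,
             123317319076102495049, 128314644111933147703,
             131707491089550166169, 139408728702078150121,
             193425473534465274127]
  print(sorted(f.bit_length() for f in factors))   # [67, 67, 67, 67, 67, 67, 68]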
>organization of p-1 factoring
>-----------------------------
>
>A lot of factors of exponents between 10000 and 1000000 were found using
>the new P-1 method. Is there a database which records which exponents
>were tested with which B1, and maybe a database of the save files?
All exponents from 20000 to 110000 were done with B1=1M and B2=40M.
Exponents from 110000 to 600000 (still in progress) are being done with
B1=100K and B2=4M. I still have the save files for exponents below 110000.
I think Alex has the save files for the larger exponents.
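As a reminder of what those bounds buy: stage 1 turns up a factor q when
q-1 is B1-powersmooth, and stage 2 extends that to a q-1 with one extra
prime between B1 and B2. A minimal Python sketch of that criterion (an
illustration only, not prime95 code; the test value is the small Cole
factor of M(67) from the sketch earlier in this message):

  def pminus1_catches(q, p, B1, B2):
      # Which P-1 stage would find the prime factor q of M(p) = 2^p - 1?
      # Since q = 2*k*p + 1, the 2*p part is easy to build into the stage 1
      # exponent, so only k is examined here.
      n = (q - 1) // (2 * p)               # this is k
      for r in range(2, B1 + 1):           # composite r never divides n here
          if n % r == 0:
              power = 1
              while n % r == 0:
                  n //= r
                  power *= r
              if power > B1:
                  return "neither (prime power %d exceeds B1)" % power
      if n == 1:
          return "stage 1"
      if n <= B2:                          # n <= B2 < B1^2 here, so n is prime
          return "stage 2"
      return "neither (cofactor %d exceeds B2)" % n

  # 193707721 - 1 = 2^3 * 3^3 * 5 * 67 * 2677, so the B1=100K bound used
  # for the 110000-600000 range would catch it in stage 1.
  print(pminus1_catches(193707721, 67, 100000, 4000000))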
However, it must be pointed out that at some point you are better off switching
to ECM rather than expanding the P-1 bounds. I'm not sure what that point is.
Regards,
George
_________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ -- http://www.tasam.com/~lrwiman/FAQ-mers