At 08:16 PM 12/26/2001 +0000, Brian Beesley wrote:

>On 25 Dec 2001, at 14:54, Steinar H. Gunderson wrote:
>
> > On Tue, Dec 25, 2001 at 01:41:31AM -0500, Paradox wrote:
> > >If the computer above could do each iteration in
> > >0.000000000000000000000000000000000000000000000000000000000000001 seconds,
> > >the amount of seconds required to complete the task would still be
> > >significantly more than 4,000,000 digits. That's incomprehensible.
> >
> > Unless we find something more efficient than the Lucas-Lehmer test before
> > that. :-)

In all fairness - the prime our group just found required over thirteen 
million Lucas-Lehmer iterations, each of which, by the best techniques in 
use 150 years ago, would have required a human to write out and square a 
number of four million digits: over 1.6x10^13 multiplications of one digit 
by another per squaring, plus a slightly smaller number of additions.

This, to the people of that time, would have required (very, very roughly) 
8x10^19 elementary operations - not counting the modulus step at all.  At a 
few seconds per operation, that comes to 2x10^20 or so person-seconds, 
carried out by the very few literate people of the day, completely without 
error, and somehow in parallel.

There is no way they would have regarded that number as possible to prove 
prime.  Can we be certain that in another century or two people won't look 
back at us and write similar messages about how impossible we would find 
their latest primes?

>Testing to see if 2k(2^p-1)+1 divides 2^(2^p-1)-1 for small values of
>k _is_ feasible in a reasonable time and _may_ result in a positive
>refutation of the theory.
>
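That k-search is cheap because divisibility reduces to one modular exponentiation: q = 2k(2^p-1)+1 divides 2^(2^p-1)-1 exactly when 2^(2^p-1) == 1 (mod q), and the 2ke+1 form holds whenever the exponent e is itself prime. A minimal sketch (function name mine; demonstrated on an ordinary Mersenne number, though the same call takes 2^p-1 as the exponent):

```python
def trial_factor(exponent, k_max):
    """Search for factors of 2**exponent - 1 of the form q = 2*k*exponent + 1.

    q divides 2**exponent - 1 exactly when 2**exponent == 1 (mod q),
    which pow() computes without ever forming 2**exponent itself.
    """
    found = []
    for k in range(1, k_max + 1):
        q = 2 * k * exponent + 1
        if pow(2, exponent, q) == 1:
            found.append(q)
    return found

# Sanity check on an ordinary Mersenne number: 2**11 - 1 = 2047 = 23 * 89,
# and both factors have the 2*k*11 + 1 form (k = 1 and k = 4).
print(trial_factor(11, 10))

# The same test on a double Mersenne, 2**(2**13 - 1) - 1 (8191 is prime):
print(trial_factor(2**13 - 1, 100))
```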
>As I (and others) have pointed out, LL testing on numbers of this
>sort of order is impossible (in this universe) because there simply
>isn't enough matter to build sufficient temporary storage- whatever
>speed we can manage to achieve.

The only possibility that occurs to me is to spread atoms out far enough to 
get many bits of information into their spacings and angles.  Of course, 
gravity would warp the structure, and the speed of light would put EXTREME 
restrictions on the maximum speed of computation.
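The scale of the storage problem is easy to make concrete. An LL test of 2^M - 1 with M = 2^13466917 - 1 needs a residue of about M bits, and M itself has about four million digits; even at one bit per atom the universe falls short by millions of orders of magnitude (the ~10^80 atom figure is the usual rough estimate, my assumption here):

```python
import math

p = 13_466_917                            # exponent of the newly found prime
# M = 2**p - 1 would be the candidate exponent; an LL residue needs ~M bits.
digits_of_M = int(p * math.log10(2)) + 1  # number of decimal digits of M
atoms_exponent = 80                       # ~10^80 atoms, rough estimate

print(f"Residue size: about 10^{digits_of_M} bits")
print(f"Shortfall: 10^{digits_of_M - atoms_exponent} bits needed per atom")
```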

>Theories based on samples involving "small numbers" of terms are
>always dangerous ... look, 3 is prime, 5 is prime, 7 is prime, 9 isn't
>prime but it does happen to be a perfect square, 11 is prime, 13 is
>prime ... it looks like all odd numbers which aren't perfect squares
>are prime ... or just look at the Fermat numbers ...

My personal favorite such 'theory':

All composite numbers have prime factors, all of which are one or more of 
the following:
1. Even primes
2. Mersenne primes
3. Fermat primes
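A quick brute-force check (a sketch, with the small Mersenne and Fermat primes hard-coded as the allowed set) shows how long the joke survives before the first composite with a 'forbidden' factor turns up:

```python
def prime_factors(n):
    """Distinct prime factors of n by trial division."""
    factors = set()
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return factors

# The only even prime, plus the small Mersenne and Fermat primes.
ALLOWED = {2, 3, 5, 7, 17, 31, 127, 257, 8191, 65537, 131071, 524287}

def first_counterexample():
    n = 4
    while True:
        factors = prime_factors(n)
        # n is composite iff its factor set is not just {n} itself.
        if factors != {n} and not factors <= ALLOWED:
            return n
        n += 1

print(first_counterexample())   # the 'theory' fails at 2 * 11
```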

Nathan Russell

_________________________________________________________________________
Unsubscribe & list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers