P.L.,
Crunching on supercomputers, pioneering number theory, and having
colleagues like Richard Brent (who squarely fits the bill of someone who
wants Mersenne factors, not just knowledge of their primality) must be a blast!
My gut reaction is that the incredible CPU advances we've witnessed
over the past 20 years are the answer to every modern computational
dilemma, factoring today's "most wanted" composites being no exception.
Your statement has made me look deeper. The difference between factoring
2^211 and 2^311 seems trivial, but with trial factoring to the square
root, I now understand that difference equates to roughly a 2^50-fold
increase in computational work. At that pace, CPU advancements haven't
kept up! The algorithms mentioned below truly impress me.
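The trial-factoring arithmetic rests on a classical fact: every prime factor q of 2^p - 1 (p an odd prime) has the form q = 2kp + 1 and satisfies q ≡ ±1 (mod 8). Here is a minimal sketch (the function name is mine, not GIMPS code) that recovers the smallest factors of 2^211 - 1 and 2^311 - 1:

```python
def smallest_mersenne_factor(p, max_k=10**6):
    """Trial-factor 2^p - 1 (p an odd prime) by testing only candidates
    q = 2*k*p + 1 with q == +-1 (mod 8) -- the only forms a prime
    factor can take.  Returns the smallest factor found, or None."""
    for k in range(1, max_k + 1):
        q = 2 * k * p + 1
        # pow(2, p, q) == 1 exactly when q divides 2^p - 1
        if q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
    return None

print(smallest_mersenne_factor(211))  # 15193    (k = 36)
print(smallest_mersenne_factor(311))  # 5344847  (k = 8593)
```

Because candidates grow with the square root of the number being factored, raising the exponent from 211 to 311 multiplies the candidate range by about 2^50, which is why trial factoring alone stalls so quickly.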
It has just taken me a week each, on a cutting-edge laptop, to do trial
factoring and P-1 factoring on a 100-million-digit Mersenne number. I know
we do this factoring to potentially shorten the primality test, but I also
thought it was this project's desire either to report the smallest divisor
or to say that none exists below a given bound. That is, if we have
by-product work beyond simple primality (no factors under 20 digits), then
let's verify it and be a source for the next project.
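Since the letter mentions running P-1, here is a minimal sketch of stage 1 of Pollard's p-1 method (names and the bound are illustrative; this is not GIMPS's actual implementation). It finds a prime factor q of n whenever q - 1 is built entirely from small prime powers:

```python
from math import gcd, isqrt

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(n) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i in range(2, n + 1) if sieve[i]]

def p_minus_1(n, bound):
    """Stage 1 of Pollard's p-1: returns a nontrivial factor of n when
    some prime factor q has q - 1 a product of prime powers <= bound."""
    a = 3  # base 2 is useless on a Mersenne number: 2^p == 1 (mod 2^p - 1)
    for q in primes_up_to(bound):
        e = q
        while e * q <= bound:  # largest power of q not exceeding the bound
            e *= q
        a = pow(a, e, n)
    d = gcd(a - 1, n)
    return d if 1 < d < n else None

# The smallest factor of 2^211 - 1 is 15193, and
# 15193 - 1 = 2^3 * 3^2 * 211 is smooth to 256, so stage 1 finds it:
print(p_minus_1(2**211 - 1, 256))
```

This is why P-1 is such a cheap complement to trial factoring: it can dig out factors far beyond any feasible trial-factoring depth, provided q - 1 happens to be smooth.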
In my lifetime: man walked on the moon (and sent probes out of the solar
system); Fermat's 'big' theorem was proved; the human genome was mapped. I
look at experts' short-sightedness; RSA-100 was once considered safe for a
lifetime. I look at distributed computing efforts, SETI with 3 million
participants, and understand the field is in its infancy. I expect computer
architecture will allow human-comparable AI in my lifetime. But will all
this focus increase today's computer factoring baseline by up to 2^466
(the trial-factoring multiple; what would the multiple be including today's
algorithms?) without algorithmic advancement? If I weren't 40...
I just read an article where someone said he could foresee a billion
people, running machines with a million CPUs, each CPU a million times
faster than today's, within ten years. All of that multiplies out to
10^21, only about 2^70.
You're a smart man, and you have grounded my feet awhile. What say I
check in again in 15 years?
Carleton
----- Original Message -----
From: <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, July 25, 2001 7:05 PM
Subject: Re: Mersenne: P-1
> > From: "CARLETON GARRISON" <[EMAIL PROTECTED]>
>
> > I honestly thought that the long-term goal (maybe not by this panel,
> > but for others) was to factor all these numbers and that we were
> > setting/recording a lower boundary for that effort.
> >
> > Carleton Garrison
>
> It will be a _long_ time before we are completely factoring those
> values of 2^n +- 1 which are now being subject to LL testing.
>
> In the 1983 Cunningham edition, (2^211 - 1)/15193 was the
> first incomplete 2^n +- 1 factorization.
> In the 1988 edition it was (2^311 - 1)/5344847.
> Now, in 2001, the exponent has risen to 641.
> This threshold exponent advanced 20 per year between 1983 and 1988,
> due primarily to the MPQS and ECM algorithms.
> Since 1988, it has risen about 25 per year, due
> primarily to the Number Field Sieve.
> Unless there are major algorithmic advances, don't
> expect to pass 2^2000 - 1 in our lifetimes.
>
> _________________________________________________________________________
> Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
> Mersenne Prime FAQ -- http://www.tasam.com/~lrwiman/FAQ-mers