Mersenne Digest         Tuesday, June 29 1999         Volume 01 : Number 590




----------------------------------------------------------------------

Date: Sun, 27 Jun 1999 22:32:01 -0400
From: "Rick Pali" <[EMAIL PROTECTED]>
Subject: RE: Mersenne: M38 in the news

From: Luke Welsh

> http://www.oregonlive.com/news/99/06/st062601.html

Interesting article, but this particular line took me by surprise: "And if
your computer doesn't find one, take heart. Kurowski plans to offer
financial compensation to participants in distributed Internet computing
networks within the next year or two."

Surely this is a misinterpreted quote.

Rick.
- -----
[EMAIL PROTECTED]
http://www.alienshore.com/

________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Sun, 27 Jun 1999 22:36:07 -0400 (EDT)
From: Chip Lynch <[EMAIL PROTECTED]>
Subject: Re: Mersenne: M38 in the news

This is a pretty good article...  a few things I didn't know; someone
should add a history section to the FAQ, for those of us that haven't been
around too long.

;-)

On the other hand, there are a few things that could be polished up in the
article... to quote:

"Although in theory an infinite number of primes could be
 found, this latest discovery is only the 38th to be verified."

Close, anyway.

- ---Chip

On Sun, 27 Jun 1999, Luke Welsh wrote:

> http://www.oregonlive.com/news/99/06/st062601.html

       \\ ^ //
        (o o)
 ---oOO--(_)--OOo------------------------------------
| Chip Lynch            |   Computer Guru            |
| [EMAIL PROTECTED]       |                            | 
| (703) 465-4176   (w)  |   (202) 362-7978   (h)     |
 ----------------------------------------------------


------------------------------

Date: Mon, 28 Jun 1999 10:17:50 +0100
From: Robin Stevens <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Linux VFAT question

On Sun, 27 Jun 1999 11:45:37 -0400, Pierre Abbat <[EMAIL PROTECTED]> wrote:
> > But the fact is that the performance of my mounted hdd win98 (vfat)
> > partition is very low. All my normal linux (ext2) partitions work with
> > normal performance. 
> 
> cd to various directories, then type vdir|wc -l in each. This tells you
> how many files the directory has. The more files, the longer it takes to
> sort them.  vdir -U lists the directory without sorting it, which is a
> little faster.
 
I too find that performance on fat/vfat partitions is low compared to ext2
partitions, but I don't think that's too surprising: the kernel is
understandably optimised for ext2, and FAT partitions are hardly the most
efficient of filesystems in the first place.  To be honest, I don't see a
lot of difference in performance on fat partitions whether or not mprime
is running.

> mprime just writes once every 30 minutes, so whether it's running in the
> Windows drive or the Linux drive is inconsequential. As to the swapping,
> the swap daemon runs at 12 naughty, so I don't see why mprime at 20 nice
> would bother it at all.

On one of my machines, disk performance appears significantly worse with
mprime running than when it's switched off.  On my P200MMX, performance on
a Quantum Fireball ST4.3A drops from about 8.25MB/s to 4.9MB/s (measured
using hdparm).  However the performance of the Seagate drive on the same
machine and the Fujitsu drive on my PII-266 is barely affected by mprime.
Still, I'd rather lose a little on disk performance than waste over 96% of
my workstation's CPU cycles :-)

One write every 30 minutes is hardly going to trouble things much.  I'd far
rather have the machine doing regular disk writes than risk losing several
weeks' (your uptime may vary) work if mprime is not terminated neatly.

- -- 
- -------------------- Robin Stevens <[EMAIL PROTECTED]> -------------------- 
Merton College, Oxford OX1 4JD, UK   http://www-astro.physics.ox.ac.uk/~rejs/ 
(+44) (0)1865: 726796 (home) 273337 (work) 273390 (fax)    Pager: 0839 629682

------------------------------

Date: Mon, 28 Jun 1999 08:55:24 -0700
From: "Jonathan Zylstra" <[EMAIL PROTECTED]>
Subject: Mersenne: Re: 10,000,000 digit prime

The 10,000,000 digit prime would have an exponent of over
3010299.956, or 3010300
which is found by taking (log 2 * 10,000,000)

J. Zylstra


------------------------------

Date: Mon, 28 Jun 1999 12:20:55 -0500 (CDT)
From: Conrad Curry <[EMAIL PROTECTED]>
Subject: Mersenne: M617 Factored

                       M617 Factored
                       -------------

    M617 has been completely factored by the Special Number
Field Sieve (SNFS).  It was previously known that

  M617 = 59233 *
         68954123297 *
         c171

    The p5 was found by Riesel in 1957 and the p11 found by
Brent and Suyama in 1981.  The c171 is a 171 digit composite
factor given by

  c171 = 133162933696720252644109076239739315294641129598\
         571214674268232878869403201703608966454713865163\
         367575359986404237817149731676559992850220509804\
         718874844608767326097407871

    On June 26, 1999 it was found that c171 = prp51 * prp120,
where

  prp51  = 1577519781152253854956475324210064781272294056\
           44601

  prp120 = 8441284558692203904329647194935146184026400044\
           2557390949549814809580375481579281829664306776\
           0548829906291314807571121271
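As a sanity check, the pieces listed above multiply back together; a few lines of Python (digits transcribed from the listing above):

```python
# factors of M617 = 2^617 - 1, as reported above
p5  = 59233
p11 = 68954123297
c171 = int(
    "133162933696720252644109076239739315294641129598"
    "571214674268232878869403201703608966454713865163"
    "367575359986404237817149731676559992850220509804"
    "718874844608767326097407871")
prp51 = int("157751978115225385495647532421006478127229405644601")
prp120 = int(
    "8441284558692203904329647194935146184026400044"
    "2557390949549814809580375481579281829664306776"
    "0548829906291314807571121271")

assert p5 * p11 * c171 == (1 << 617) - 1   # the known partial factorization
assert prp51 * prp120 == c171              # the new SNFS split of c171
```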

    The factorization of M617 was 'Most' wanted by the
Cunningham project[1] and is the smallest number of the form
2^n-1 whose complete factorization was not known.  That
distinction now goes to M619.

    The sieving was done by a group of 19 volunteers starting
May 7, 1999 and finishing on June 16, 1999.  A total of
15,986,534 relations were collected, requiring about 1.2 Gbytes
of memory in uncompressed ASCII format.  The resulting matrix
was 1563691 x 1566436.  The linear algebra and square-root
phases were done at Centrum voor Wiskunde en Informatica (CWI)
by Peter Montgomery.

    Acknowledgments are due to the volunteer sievers

  Pierre Abbat                  Ricardo Aguilera
  Brian Briggs                  Gary Clayton
  David Crandell                Conrad Curry
  Kelly Hall                    Philip Heede
  Jim Howell                    Skip Key
  Alex Kruppa                   Samuli Larvala
  Don Leclair                   Ernst Mayer
  Thomas Noekleby               Henrik Oluf Olsen
  Marcio de Moraes Palmeira     Guillermo Ballester Valor
  Paulo Vargas

    Special thanks to Bob Silverman, Peter Montgomery, Alex
Kruppa, Don Leclair and Ernst Mayer.  Also to CWI, the Department
of Computer Sciences at the Technical University Munich and the
School of Mathematical Sciences at the University of Southern
Mississippi for the use of their computers.

    M619 is currently being sieved by SNFS.  It is on the Cunningham
project's 'More' wanted list.  If you would like to donate some
of your CPU time visit [2] and download the siever for DOS or
linux.  Source code is available on request to compile on other
platforms.  You will need at least 21 Mbytes of free physical
memory.

[1] http://www.cs.purdue.edu/homes/ssw/cun/index.html
[2] ftp://ftp.netdoor.com/users/acurry/nfs


------------------------------

Date: Mon, 28 Jun 1999 14:19:35 -0400
From: Jeff Woods <[EMAIL PROTECTED]>
Subject: Re: Mersenne: A few questions

At 07:08 AM 6/28/99 +0100, you wrote:

> > I *suspect* that in light of GIMPS' success, he is likely looking much
> > higher than we are now (and has been for some time, again as a 
> guess).   He
> > knows our current program cannot top P > 20,000,000, so I suspect he's
> > above that range, perhaps by a good margin.  It may be that he breaks 
> GIMPS
> > record(s) again someday.
>
>Are you sure about this?

By no stretch of anyone's imagination.   As I said, I don't know David from 
Adam, and have never talked to him, even by Email.   My speculation was a 
SWAG, and could be 180 degrees from the truth.   Your own information about 
his ability to get CPU time does tend to contradict my guesses, which were 
based on the hearsay I noted below.

>George told me that he'd asked David S. to
>verify the newly-discovered prime, but that David was having
>difficulty getting access to sufficient CPU time & would only be able
>to run in the background, taking approx. 4 weeks. Several others of
>us would be able to do an independent double-check run on a 6 million
>range exponent in that time, e.g. I could use MacLucasUNIX on my 533
>MHz Alpha 21164 & complete in 4 weeks (ish). Which is why I contacted
>George, to offer my services.
>
> >  I do know that the last time George expanded
> > Prime95 to hunt up to 20MM (up from 3MM), that George sent several 
> residues
> > in the upper ranges to David for confirmation, and that David was able to
> > confirm SOME of them rather quickly (faster than a Cray could do the
> > calculations on the spot, so they were already tested).  This to me
> > indicates that David is searching the numbers far above our top potential
> > range, especially since a Cray can test such numbers in about a week or
> > two, as a guess.
>
>Well - for testing the algorithm you don't neccessarily have to run
>to completion - I've also run quite a number of exponents between 10
>and 20 million to 400 iterations for the purpose of cross-checking
>results. And I'll be doing a lot more when I get sufficient memory.

While not CERTAIN, I'm fairly sure that the test values David confirmed 
were of "last iteration" residues.

>Verification of 3021377 took DS 8 hours on "his" Cray. A "square law"
>seems reasonable (run time = #iterations * time per iteration, time
>per iteration is O(n log n) where n is the FFT size) so, 20 million
>being approx. sqrt(50) times 3 million, run times of 50 * 8 hours =
>2.5 weeks would be expected - assuming unlimited access to the whole
>system. For exponents >33.22 million (10 million digit numbers) he'd
>be well into months per test.

YOWCH!   How long on a P-III/450, or an Athlon (K7), then?
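Brian's scaling argument above can be sketched numerically (plain Python; the formula and the 8-hour / 3021377 reference point are my reading of his estimate, not an official benchmark):

```python
import math

def est_hours(p, ref_p=3021377, ref_hours=8.0):
    # run time ~ p iterations, each O(n log n) with FFT size n ~ p,
    # so total ~ p^2 * log p; scale from the known reference point
    return ref_hours * (p / ref_p) ** 2 * (math.log(p) / math.log(ref_p))

# est_hours(20_000_000) comes out near 400 hours, i.e. roughly the
# "2.5 weeks" quoted above; a 33.22-million exponent lands well past that
```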

------------------------------

Date: Mon, 28 Jun 1999 21:59:20 +0100
From: Gordon Spence <[EMAIL PROTECTED]>
Subject: Mersenne: LL & Factoring DE Crediting

The GIMPS home page explains the following

"Finally, if a factor is later found for a Mersenne number or the
Lucas-Lehmer result is found to be incorrect, then you will "lose credit"
for the time spent running the test."

It is on <www.mersenne.org/top.htm>.  It always struck me as odd, I must
admit.  At the time the test was done it obviously WAS required, so why the
subsequent penalty when we make advances and can factor much deeper?

regards

Gordon



------------------------------

Date: Mon, 28 Jun 1999 16:10:48 -0500
From: "Griffith, Shaun" <[EMAIL PROTECTED]>
Subject: Mersenne: RE: Mersenne Digest V1 #584

[On the off chance that no one else has replied to this since it was
received...]

On Sat, 19 Jun 1999 12:55:46, Jeff Woods wrote:

<snip>

Starting at about M13, you see that there are indeed islands:

<snip some more>

Given the strong linearity of "log(exponents of mersenne primes)", it is not
surprising that the averages of consecutive pairs or triples will also be
linear. Indeed, taking the "centers" of the islands as data points will
actually produce a stronger correlation than the original data.

If I go back and average consecutive exponents (well, the log of the
exponents, anyway), the correlation with a straight line improves from .9925
to .9935 (compared to the original data). If I get to throw away the ones
that don't "fit" according to some arbitrary criterion, it may improve
marginally. [Note: using your choices actually reduces the correlation to
.9849!]

So perhaps we should remember Mark Twain: "There are three kinds of lies:
lies, damn lies, and statistics."

- -Shaun
Quantum Mechanics: The dreams stuff is made of

------------------------------

Date: Mon, 28 Jun 1999 23:42:24 +0200
From: "Steinar H. Gunderson" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: A few questions

On Sun, Jun 27, 1999 at 09:08:15PM -0400, George Woltman wrote:
>Assuming they respond that the claim
>is all in order, then we should be able to announce shortly thereafter.

I'm keeping my fingers, toes and hairs crossed :-) Just too bad nobody
else has participated in my guess-contest... That means I will be the
sole winner! Hooray!

Seriously, though, perhaps Slashdot will post the news when the exponent
is official. I've already submitted it three times...

/* Steinar */

------------------------------

Date: Mon, 28 Jun 1999 23:43:33 +0100
From: "Brian J. Beesley" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting

On 28 Jun 99, at 21:59, Gordon Spence wrote:

> "Finally, if a factor is later found for a Mersenne number or the
> Lucas-Lehmer result is found to be incorrect, then you will "lose credit"
> for the time spent running the test."

I obviously can't answer for George. PrimeNet works a different way: 
Scott credits any work submitted in good faith, whatever happens 
thereafter. I think George's idea was to encourage people to do the 
appropriate amount of trial factoring - if you didn't do enough and 
proceeded to run the LL test, you lose the credit because you should 
have done more factoring to start off with. Otherwise you would maximize 
your credit by simply running LL tests & ignoring factoring 
altogether.

The problem with this approach is that goalposts shift. Different 
processor types have different LL test:factoring performance ratios, 
in particular the "plain Pentium" (Intel Pentium "classic" and Socket 
7 MMX processors) are rather poor at factoring compared with most 
other processor types, and processors with high core:bus speed ratios 
are going to be (relatively) better at factoring because of the lower 
memory bus load. Consequently the appropriate factoring depth varies 
to some extent depending on the mix of processors in use.
> 
> It is on <www.mersenne.org/top.htm> always struck me as odd I must admit.
> At the time the test was done it obviously WAS required, so why the
> subsequent penalty when we make advances and can factor much deeper?
> 
I agree. However, to take the point to a ridiculous extreme, finding 
a factor saves running a LL test - so why can't I have credit for 
finding 54,522 (small) factors in the range 33.2 million to 36 
million, thus saving (very approximately) 54,522 * 8 P90 CPU years LL 
testing? The job ran in an hour on a PII-350!

Somewhere in here there is a compromise. I'd suggest:

(a) you should lose _double_ credit for a LL test if the result is 
proved incorrect, or if a factor is found in a range which you claim 
to have checked;
(b) a table should be published of the suggested factoring depths for 
various exponent ranges. If you submit a LL test result, then someone 
finds a factor less than the table depth at the date you submitted 
the result, you lose the credit; otherwise, you keep the credit;
(c) however deep you factor, you only get credit for factoring to the 
table depth in the current table.

BTW I don't really understand why PrimeNet doesn't credit you for 
results obtained when the manual testing pages are used instead of 
the automatic system. I suppose Scott is trying to promote the 
automatic system, and I'm in agreement with that sentiment, but it 
still seems a bit strange to me.

Regards
Brian Beesley

------------------------------

Date: Mon, 28 Jun 1999 19:03:59 -0400
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: Mersenne: Status estimate

The Status estimate of the chance that the number you are testing is prime
seems to be off by a factor of e, based on Wagstaff's estimate.  

+----------------------------------------------+
| Jud "program first and think later" McCranie |
+----------------------------------------------+



------------------------------

Date: Mon, 28 Jun 1999 20:24:46 -0400
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting

At 09:59 PM 6/28/99 +0100, Gordon Spence wrote:
>The GIMPS home page explains the following
>
>"Finally, if a factor is later found for a Mersenne number or the
>Lucas-Lehmer result is found to be incorrect, then you will "lose credit"
>for the time spent running the test."
>
>It is on <www.mersenne.org/top.htm> always struck me as odd I must admit.

It isn't very clear, but the purpose may have been to discourage bogus positive
results, because they will eventually be double-checked.

+----------------------------------------------+
| Jud "program first and think later" McCranie |
+----------------------------------------------+



------------------------------

Date: Mon, 28 Jun 1999 20:51:13 -0400
From: George Woltman <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting

Hi all,

At 09:59 PM 6/28/99 +0100, Gordon Spence wrote:
>if a factor is later found for a Mersenne number or the
>Lucas-Lehmer result is found to be incorrect, then you will "lose credit"
>for the time spent running the test."
>
>It is on <www.mersenne.org/top.htm> always struck me as odd I must admit.
>At the time the test was done it obviously WAS required, so why the
>subsequent penalty when we make advances and can factor much deeper?

The explanation is really simple.  It saves me from having to keep track
of too much data (the bad results and the ones for which a factor was later
found).

However, since the v17 bug, I've had to modify my procedures anyway and
give credit for those incorrect results.  I suppose it wouldn't be too
hard to do the same for factors found and LL results proven incorrect.

Regards,
George




------------------------------

Date: Mon, 28 Jun 1999 21:04:16 -0400
From: Peter Doherty <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Status estimate

At 19:03 06/28/1999 -0400, you wrote:
>The Status estimate of the chance that the number you are testing is prime
>seems to be off by a factor of e, based on Wagstaff's estimate.  
>
>+----------------------------------------------+
>| Jud "program first and think later" McCranie |
>+----------------------------------------------+ 


Yeah, I was wondering about that thing...if we don't even know whether there
are finitely or infinitely many Mersenne primes, how does it generate that
number?  Mine says my odds are about 1 in 60,000 or so....I'm doing an LL
test on a number in the 719 range, and back when I was testing numbers in
the 400 range the odds were much better...around 1 in 45,000.


- --Peter


------------------------------

Date: Mon, 28 Jun 1999 19:45:11 -0700
From: Eric Hahn <[EMAIL PROTECTED]>
Subject: RE: Mersenne: A few questions

>How large will the exponent be for a 10,000,000 digit prime number?

To be a 10,000,000 digit prime number the exponent must be at least
33,219,281 (which also happens to be a Mersenne candidate).

>Has the prime number that was found a week ago been announced on this
>list?
>I.E.  What number was it?

It hasn't been announced yet... but from what little information 
is available, i.e. The Oregonian newspaper article, the
exponent must be =at least= 6,643,859.

Eric


------------------------------

Date: Mon, 28 Jun 1999 23:03:26 -0400 (EDT)
From: "David A. Miller" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting

Brian J. Beesley writes:

>thereafter. I think George's idea was to encourage people to do the 
>appropriate amount of trial factoring - if you didn't do enough, and 
>proceed to run the LL test, you lose the credit because you should 

This doesn't fit with my understanding of how Prime95 works. I thought that
the program automatically chose the factoring depth using a formula that
weighs the chance of finding a factor against the time that would be
required for the LL test. If there is a way for the user to control the
amount of factoring, then it is news to me.


David A. Miller
Rumors of my existence have been greatly exaggerated.


------------------------------

Date: Mon, 28 Jun 1999 20:21:29 -0700
From: Eric Hahn <[EMAIL PROTECTED]>
Subject: RE: Mersenne: A few questions

CORRECTION TO LAST MESSAGE!

The exponent 33,219,281 happens to be a Mersenne prime candidate!



------------------------------

Date: Mon, 28 Jun 1999 22:25:24 -0700
From: "Scott Kurowski" <[EMAIL PROTECTED]>
Subject: Mersenne: PrimeNet Stats Updated

PrimeNet's updated CPU stats chart is at
http://entropia.com/ips/stats.html

The last chart update was mid February.  There are a handful of days that
reached 800 gigaflop rates, and a nice view of the rate recovery from the v17
bug.  A linear trend is no longer the best fit.

Regards,
scott



------------------------------

Date: Tue, 29 Jun 1999 04:16:19 -0400 (EDT)
From: Lucas Wiman  <[EMAIL PROTECTED]>
Subject: Mersenne: Mersenne FAQ 1.1

All,
Here is version 1.1 of the FAQ.
I corrected a few typos.  I then added 500 more of them when I added the LL
section.  The LL section needs major revision, and clarification, especially
the repeating LL part.  I think we might add a section on FFT, and DWT, though
I don't know enough about math to do it.  I also like the idea of a history 
section.  I have read something about the history of GIMPS, as well as some of
the list's archives, but I have only been here since February...

- -Lucas Wiman



Q: What are Mersenne numbers?
A: Mersenne numbers are numbers of the form 2^p-1 (2 to the pth power minus 1).
Generally, when Mersenne numbers are mentioned, Mersenne numbers with prime
exponents are what is actually meant.

Q: What is a Mersenne prime?
A: A Mersenne prime is a Mersenne number Mp which is prime.  p must itself
be prime; otherwise Mp has an obvious factorization (if p=a*b, then 2^a-1
divides 2^p-1).

Q: How can we prove these massive numbers are prime?
A: We can use a method known as the Lucas-Lehmer (LL) test.  It works like
this:
For p an odd prime, the Mersenne number 2^p-1 is prime if and only if 2^p-1
divides S(p-1), where S(n+1) = S(n)^2-2 and S(1)=4

Q: Won't those S(n) get huge very quickly?  How can we check whether 2^p-1 
divides S(p-1)?
A: Yes - and we check with the magic of modular arithmetic.  
Modular arithmetic deals with remainders from long division (like you did in
grade school).  We say that a==b mod c if a and b both have the same 
remainder when divided by c (which is read "a is congruent to b mod c"; on 
paper, we would write this a=b mod c where = has 3 horizontal lines).
Another way to think of this is that (a-b)/c is an integer.  Here are the
most important rules of modular arithmetic:

(1) if a==0 mod c then a is divisible by c
(2) (a+b) mod c = ((a mod c)+(b mod c)) mod c
(3) (a*b) mod c = ((a mod c)*(b mod c)) mod c
(4) a^b mod c = (a mod c)^b mod c

Now, we can define a new series (call it s(n)) where
s(1)=4, and s(n+1)=s(n)^2-2 mod (2^p-1).
Using these rules of modular arithmetic, we can see that 2^p-1 divides S(p-1)
if and only if s(p-1)=0.  Therefore, the numbers we work with can never get
above (2^p-1)^2 (no bigger than 2^p-1 with some cleverness).
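The recipe above fits in a few lines of Python; this is an editor's illustrative sketch (fine for small p, hopeless for multi-million-digit candidates, which need FFT multiplication):

```python
def lucas_lehmer(p):
    # True iff M_p = 2^p - 1 is prime, for an odd prime p
    m = (1 << p) - 1           # the Mersenne number 2^p - 1
    s = 4                      # s(1) = 4
    for _ in range(p - 2):     # produce s(2) .. s(p-1), reduced mod m
        s = (s * s - 2) % m
    return s == 0              # M_p is prime iff s(p-1) == 0

# the Mersenne prime exponents up to 31 are 3, 5, 7, 13, 17, 19, 31;
# M11, M23 and M29 are composite, and the test reports exactly that
```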

Q: If the remainders of the LL series ever repeated, we could skip the rest of 
the test, because it could never equal 0.  Why don't we test for this?
A: Well, a simple heuristic argument involves simple probabilities.  
Think about this: for the recurrence to be important, it must occur in the
first p-1 iterations.  *BUT* there are 2^p-1 possible remainders, which means
that the probability of such a recurrence is approximately (p-1)/(2^p-1).
These odds are better than suddenly being transported to Mars by quantum 
fluctuations, but worse than winning all the lotteries in the world
simultaneously.  (Well, it of course depends upon the size of p; when p gets
large enough, the probability gets arbitrarily close to zero.)

Here's another way to think of it:
as the S-sequence is defined above, 
S(n)=L(2^(n-1)) (where L(n)=(2+sqrt(3))^n+(2-sqrt(3))^n, so L(1)=4=S(1))

Now suppose for instance the recurrence occurred between L(2^(n-3)) and 
L(2^(n-2)) - which, even assuming the "first" recurrence is equally likely 
anywhere, would occur with probability 50%. The S-sequence would not even see 
it.

And furthermore:
    We can illustrate working modulo M23 = 8388607 = 47 * 178481.
3 is a quadratic residue modulo 47 (e.g., 12^2 == 3 mod 47),
so 2 + sqrt(3) is in the base field of order 47.
The multiplicative order of 2 + sqrt(3) divides 47-1, and turns out to be 23.

    3 is a quadratic nonresidue modulo q = 178481.  The order of 2 + sqrt(3)
divides q + 1 = 178482 = 2 * 3 * 151 * 197 and turns out to be 178482.

    The least common multiple of these orders is
2 * 3 * 23 * 151 * 197.  So L(2 * 3 * 23 * 151 * 197) == L(0) mod M23.

    For the L sequence, we need two powers of 2 whose difference is
divisible by 2 * 3 * 23 * 151 * 197.
The orders of 2 modulo 3, 23, 151, 197 are 2, 11, 15, 196, respectively.
The order of 2 modulo 3 * 23 * 151 * 197 is the least common multiple
of these orders, namely 2^2 * 3 * 5 * 7^2 * 11 = 32340.
To include a factor of 2, we need L(2^32341) == L(2^1) mod M23.
That is, S(32342) == S(2) mod M23.
This is far more than 23 LL steps before the sequence repeats.
(Thank you, Chris Nash and Peter Lawrence Montgomery)
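The argument above makes a concrete prediction that is cheap to check directly (plain Python):

```python
M23 = (1 << 23) - 1                 # 8388607 = 47 * 178481
assert M23 == 47 * 178481

x = 14                              # the LL term right after s(1) = 4
for _ in range(32340):              # the lcm-derived bound worked out above
    x = (x * x - 2) % M23
assert x == 14                      # cycled back - far more than 23 steps in
```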

Q: Wouldn't it make more sense to record the S series (without taking the
remainder) and save the result, thereby saving a lot of work?
A: No.  The number of digits in the S series doubles with each iteration;
thus by the 270th iteration, you've run out of protons in the universe to 
store it in.

Q: If some s(n) is larger than half the size of the Mersenne number, wouldn't
it make sense to take 2^p-1-s(n) and square that and subtract 2?
A: This would usually take off one bit.  When there are seven million of them,
it really doesn't make much difference, and it would take longer to determine
whether s(n) is greater than 2^(p-1) than the time gained from 1 or 2 bits.  
Furthermore, this would slow down all the other iterations, which is
something we really don't want...

Q: How is factoring done on Mersenne numbers?
A: We check a potential factor n of Mp by seeing whether 2^p mod n = 1

Q: But doesn't this mean using very large numbers, and computing 2^p?
A: Not at all; there is a very simple algorithm for finding a^b (mod c),
and here's how it works:
we set res=1,
then multiply res by a^(2^n) mod c whenever the nth binary digit of b is 1.
a^(2^n) mod c is relatively easy to calculate through repeated squaring of
a mod c. 
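Sketched in Python (this mirrors the description above; Python's built-in pow(a, b, c) does the same thing):

```python
def powmod(a, b, c):
    # compute a^b mod c by scanning the bits of b right to left
    res = 1
    sq = a % c                 # a^(2^0) mod c
    while b:
        if b & 1:              # nth bit of b is set: multiply in a^(2^n)
            res = (res * sq) % c
        sq = (sq * sq) % c     # next repeated squaring: a^(2^(n+1)) mod c
        b >>= 1
    return res

# powmod(2, 11, 23) == 1, confirming that 23 divides M11 = 2047 = 23 * 89
```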

Q: That's still a lot of factors!!!
A: Well, we can significantly reduce the number of candidates through some
simple theorems which state that:
(1) any factor of 2^p-1 (p prime) must be of the form 2*k*p+1 for k>0
(2) any factor of 2^p-1 must be of the form 8*n+1 or 8*n-1 for n>0
(3) the smallest factor (other than 1) of any number must be prime
Theorem (1) reduces the number of cases to 1/(2*p) of all numbers.
Theorem (2) reduces those to 1/2 of those.
Theorem (3) could reduce the candidates down to only prime numbers, but that
takes too much time.
It is generally better simply to sieve out potential factors with very small
factors.
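A quick spot-check of theorems (1) and (2) against a few classically known factors (plain Python; the pair list is just illustrative):

```python
# (p, q) pairs where q is a known factor of 2^p - 1
known = [(11, 23), (23, 47), (29, 233), (37, 223)]

for p, q in known:
    assert pow(2, p, q) == 1       # q really divides 2^p - 1
    assert (q - 1) % (2 * p) == 0  # theorem (1): q = 2*k*p + 1
    assert q % 8 in (1, 7)         # theorem (2): q = 8*n+1 or 8*n-1
```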

Q: But how can you sieve out small factors from potential factors?
A: Well, I think an example would be most illustrative.
Say we want to sieve out all potential factors divisible by 3, 5, or 7.
Well, we can create an array of 3*5*7=105 elements, with all elements set
to 1 (we'll call this array N).  Then eliminate all N[i] such that i is
divisible by 3, 5, or 7.  Then we can keep a running value of 2*k*p+1 mod 105;
if N[2*k*p+1 mod 105]=0 we skip that factor, and go on to the next one.
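The example above in runnable form (plain Python; p = 23 and the k range are just illustrative):

```python
p = 23                              # illustrative exponent
M = 3 * 5 * 7                       # wheel modulus, 105
N = [1] * M
for i in range(M):
    if i % 3 == 0 or i % 5 == 0 or i % 7 == 0:
        N[i] = 0                    # eliminate residues divisible by 3, 5 or 7

survivors = []
for k in range(1, 12):
    f = 2 * k * p + 1               # candidate factor, theorem (1) form
    if N[f % M]:                    # one table lookup instead of 3 divisions
        survivors.append(f)

# survivors is [47, 139, 277, 323, 461] - and 47 really does divide M23
```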

Q: How much does all this actually help?
A: The sieving for small factors (up to 17), as well as theorem (2) 
leaves only about 32% of potential factors of the form (2*k*p+1).

Q: Wouldn't it be more efficient to check prime numbers and see if they
divide a certain 2^p-1?
A: Yes and no.  It is theoretically more efficient to do this; however, 
this also produces uninteresting (and relatively useless) results where p
itself is composite.  

Q: A prime number cannot divide more than one Mersenne number with prime
exponent, so why not create a table of all the prime numbers which divide
Mersenne numbers so as not to duplicate work?
A: This isn't really workable because it would take longer to check the
table than it would to just check the factor.

    



------------------------------

Date: Tue, 29 Jun 1999 12:12:28 +0200
From: Sturle Sunde <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting 

> The GIMPS home page explains the following
> "Finally, if a factor is later found for a Mersenne number or the
> Lucas-Lehmer result is found to be incorrect, then you will "lose credit"
> for the time spent running the test."
> 
> It is on <www.mersenne.org/top.htm> always struck me as odd I must admit.
> At the time the test was done it obviously WAS required, so why the
> subsequent penalty when we make advances and can factor much deeper?

I have always liked George's way of calculating credits.  I think it is 
very nice and clean.  It also encourages factoring, and the factors of 
Mersenne numbers interest mathematicians.  It _gives_ credit for 
factoring, by giving the one who finds factors a relative advantage 
over those who spend less time factoring.  If you find a factor of a 
number which is tested already, you climb by pushing someone else down.  
If I don't factor far enough, that will eventually happen to me.  This 
keeps my {3,4}86's, m68k's, mips3k's, and old sparc's busy. 8-)

We can discuss different ways of counting CPU-time, etc. for ever, but 
we will never find the Correct Formula.  Therefore I think that George's 
formula - just counting LL results for every Mersenne number without a 
factor in the database, and giving credit to the people who tested those 
numbers - is a beautiful solution.  A simple, working solution to a 
problem which can be made very complex, time- and space-consuming.

Scott's formula is also a simple, working solution which suits Primenet.


- -- 
Sturle   URL: http://www.stud.ifi.uio.no/~sturles/   Er det måndag i dag?
~~~~~~   MMF: http://www.alladvantage.com/go.asp?refid=BUP399  - St. URLe


________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 15:08:56 +0200
From: "Steinar H. Gunderson" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Re: 10,000,000 digit prime

At 15:55 28.06.99 +0000, Jonathan Zylstra wrote:
>The 10,000,000 digit prime would have an exponent of over
>3010299.956, or 3010300
>which is found by taking (log 2 * 10,000,000)

Actually, it's 10,000,000 / log10(2), which is a much larger number. Of
course, since I'm not at home, I can't figure out _that_ number offhand,
but see the posts from some weeks back to get the exact first exponent.
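For reference, that number is easy to compute: 2^p has floor(p * log10(2)) + 1 decimal digits, so the smallest p giving at least 10,000,000 digits is ceil(9,999,999 / log10(2)). A Python sketch (the trial-division helper is mine, purely for illustration):

```python
import math

LOG10_2 = math.log10(2)  # about 0.30103

# 2^p (and hence 2^p - 1) has floor(p * log10(2)) + 1 decimal digits,
# so we need p * log10(2) >= 9,999,999:
p = math.ceil(9999999 / LOG10_2)
print(p)  # 33219278

def is_prime(n):
    """Simple trial division; fine for numbers this small."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

# A Mersenne prime needs a prime exponent, so the first candidate is:
while not is_prime(p):
    p += 1
print(p)  # 33219281
```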

/* Steinar */


________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 09:09:10 -0400
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Mersenne FAQ 1.1

At 04:16 AM 6/29/99 -0400, Lucas Wiman wrote:
>All,
>Here is version 1.1 of the FAQ.

Here's a question that needs to be addressed: how to go from digits to
exponents, and from exponents to digits.
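Both directions follow from the fact that 2^p - 1 has floor(p * log10(2)) + 1 decimal digits. A sketch of possible FAQ formulas in Python (function names are mine):

```python
import math

LOG10_2 = math.log10(2)

def exponent_to_digits(p):
    """Decimal digits in 2^p - 1 (same as in 2^p, since 2^p is
    never a power of 10)."""
    return math.floor(p * LOG10_2) + 1

def digits_to_min_exponent(d):
    """Smallest exponent p such that 2^p - 1 has at least d digits."""
    return math.ceil((d - 1) / LOG10_2)

# M38's exponent, 6972593, gives a 2,098,960-digit number:
print(exponent_to_digits(6972593))  # 2098960
```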

+----------------------------------------------+
| Jud "program first and think later" McCranie |
+----------------------------------------------+


________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 09:39:36 -0400
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Mersenne FAQ 1.1

At 04:16 AM 6/29/99 -0400, Lucas Wiman wrote:
>All,
>Here is version 1.1 of the FAQ.

Also, FAQ candidates include: why do we think there are an infinite number of
Mersenne primes, how many are expected below a given limit, and what is the
probability of finding one?

+----------------------------------------------+
| Jud "program first and think later" McCranie |
+----------------------------------------------+


________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 09:37:56 -0400
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Status estimate

At 09:04 PM 6/28/99 -0400, Peter Doherty wrote:
>>The Status estimate of the chance that the number you are testing is prime
>>seems to be off by a factor of e, based on Wagstaff's estimate.
>>
>Yeah, I was wondering about that thing...

What I'm saying is that if p is prime, the best estimate of the probability
that Mp is prime is 2.5695 * log(ap)/p, where a = 6 if p == 1 mod 4 and
a = 2 if p == 3 mod 4.  For example, for exponents around 7,000,000 this
gives probabilities of 6.44E-6 and 6.04E-6 respectively, or about 1/155,200
and 1/165,600.  The figures that Status gives for the exponents I'm working
on are precisely a factor of e higher.
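For anyone who wants to reproduce these figures, a Python sketch of the estimate above (the function name is mine; the constant 2.5695 is approximately e^gamma / log 2, per the Wagstaff heuristic quoted here):

```python
import math

def mersenne_prime_chance(p):
    """Estimate (after Wagstaff) of the chance that 2^p - 1 is prime,
    meaningful for prime p with no factor found.  The constant 2.5695
    is approximately e^gamma / log(2)."""
    a = 6 if p % 4 == 1 else 2
    return 2.5695 * math.log(a * p) / p

# Illustrative exponents near 7,000,000, one in each residue class mod 4:
print(mersenne_prime_chance(7000001))  # about 6.44e-06
print(mersenne_prime_chance(7000003))  # about 6.04e-06
```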

+----------------------------------------------+
| Jud "program first and think later" McCranie |
+----------------------------------------------+


________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 23:15:49 +0200
From: "Steinar H. Gunderson" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting

On Mon, Jun 28, 1999 at 11:03:26PM -0400, David A. Miller wrote:
>If there is a way for the user to control the
>amount of factoring, then it is news to me.

It's in the `Advanced/Factor' menu choice. mprime hasn't got this option,
but I haven't bothered to send in a bug report.

/* Steinar */
________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 23:13:26 +0200
From: "Steinar H. Gunderson" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: LL & Factoring DE Crediting

On Mon, Jun 28, 1999 at 11:43:33PM +0100, Brian J. Beesley wrote:
>I obviously can't answer for George. PrimeNet works a different way, 
>Scott credits any work submitted in good faith, whatever happens 
>thereafter.

More importantly, Scott credits factoring work as well. Since George didn't
credit factoring work, I guess he set up these rules to discourage doing
LL testing only (i.e. skipping the factoring step).

>I agree. However, to take the point to a ridiculous extreme, finding 
>a factor saves running a LL test - so why can't I have credit for 
>finding 54,522 (small) factors in the range 33.2 million to 36 
>million, thus saving (very approximately) 54,522 * 8 P90 CPU years LL 
>testing? The job ran in an hour on a PII-350!

I'm sure that if you asked Scott, he would credit that to you as factoring
work.

However, while a factor `saves' an LL test, this is expected behaviour,
and not something extraordinary. If every factor were credited as a full
LL test, most people would do factoring only! I believe PrimeNet's
solution to this is close to optimal.

>(a) you should lose _double_ credit for a LL test if the result is 
>proved incorrect, or if a factor is found in a range which you claim 
>to have checked;

Why? I'd rather stick with the PrimeNet policies, where you never lose
any credit at all.

>BTW I don't really understand why PrimeNet doesn't credit you for 
>results obtained when the manual testing pages are used instead of 
>the automatic system. I suppose Scott is trying to promote the 
>automatic system, and I'm in agreement with that sentiment, but it 
>still seems a bit strange to me.

There are always technical reasons, you know :-) But honestly, I don't
have a clue about this one.

/* Steinar */
________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 23:14:48 +0200
From: "Steinar H. Gunderson" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: A few questions

On Mon, Jun 28, 1999 at 07:45:11PM -0700, Eric Hahn wrote:
>It hasn't been announced yet... but from what little information 
>that is available, i.e. The Oregonian newspaper article, the
>exponent must be =at least= 6,643,859.

Hmmm, my guess was about 6.2 million, but nobody else guessed,
so there :-)

/* Steinar */
________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

Date: Tue, 29 Jun 1999 23:17:20 +0200
From: "Steinar H. Gunderson" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: PrimeNet Stats Updated

On Mon, Jun 28, 1999 at 10:25:24PM -0700, Scott Kurowski wrote:
>The last chart update was mid February.  There are a handful of days that
>reached 800 gigaflop rates, and a nice view of the rate recovery from the v17
>bug.  A linear trend is no longer the best fit.

Then what is the best fit? Exponential? :-) Or even better (as I've heard
some people use -- I don't remember the name, though), an exponential
curve, where the exponent grows exponentially? (Well, OK, that is perhaps
a bit _too_ optimistic.)

/* Steinar */
________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm

------------------------------

End of Mersenne Digest V1 #590
******************************
