#7013: [with patch, needs work] prime_pi and nth_prime
----------------------------------------+-----------------------------------
Reporter: kevin.stueve | Owner: kevin.stueve
Type: enhancement | Status: needs_work
Priority: major | Milestone: sage-4.4
Component: number theory | Keywords: primes, sieve, table, LMO
Author: Kevin Stueve | Upstream: N/A
Reviewer: was,robertwb,GeorgSWeber | Merged:
Work_issues: |
----------------------------------------+-----------------------------------
Comment(by kevin.stueve):
Replying to [comment:48 leif]:
> Replying to [comment:47 kevin.stueve]:
> > I updated our prime counting code to use denser tables from TOS, TRN,
and Kulsha, and an updated version of TOS's prime_sieve.c from Leif.
>
> Note that this version of {{{prime_sieve.c}}} is outdated (and btw, it
should be called {{{prime_pi_sieve_tos.c}}}).
>
> I hope I can provide a slightly polished version of the latest (Jan
2010, with optimizations, all overflow conditions I'm aware of - see #7539
- fixed) within this week.
I thought I had included your latest version; maybe I messed up. =)
When you post the new version, please add a descriptive docstring with
authors and changes. Thanks again for your work. =)
>
>
> > Right now the interval [1e14,1e15] has spacing 1e10. It would be
nice, and not too difficult, to sieve this interval with prime_sieve.c on
Sagemath with spacing 1e9. This would be good to compare against the new
data Andrew Booker is going to calculate.
>
> Regarding the current density in other ranges, I'd consider this rather
a waste of time and space; even with - nowadays suboptimal - LMO, it takes
only a few seconds to compute pi(x) in that range.
I disagree with you on that point. 1e14(1e9)1e15 would be very useful.
Personally I would like much denser data. =)
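The table-plus-sieve scheme being discussed can be sketched in a few lines.
This is my own minimal illustration, not the actual patch: `primes_in_range`
is a slow trial-division stand-in for a real segmented sieve such as
prime_sieve.c, and the checkpoint table below (spacing 100) is tiny and
illustrative, not the real TOS/TRN data. The point is only that denser
checkpoints shrink the interval that must be re-sieved per query.

```python
def primes_in_range(lo, hi):
    """Count primes in [lo, hi]. Trial division here; a real
    implementation would use a segmented sieve (e.g. prime_sieve.c)."""
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True
    return sum(1 for n in range(lo, hi + 1) if is_prime(n))

def table_prime_pi(x, spacing, table):
    """table[k] holds pi(k * spacing); sieve only the tail beyond
    the nearest checkpoint at or below x."""
    k = x // spacing
    return table[k] + primes_in_range(k * spacing + 1, x)

# Illustrative checkpoints: pi(0), pi(100), pi(200), ..., pi(500)
spacing = 100
table = [0, 25, 46, 62, 78, 95]
print(table_prime_pi(250, spacing, table))  # pi(250) = 53
```

With spacing 100, computing pi(250) sieves only 50 numbers past the
pi(200) checkpoint; with spacing 10 it would sieve none at all, which is
the whole argument for denser tables.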
>
> Above 10^19^ we have a spacing of 10^16^, which is far too sparse; I'm
currently computing pi(N*10^15^) for N>10000 (hopefully up to N=18446,
i.e. 2^64^, but this will take months...).
If you can tell me where the program is, I would like to sieve
1e14(1e9)1e15. Why not store much denser data than 1e19(1e15)2^64^? Why
not store 1e19(1e9)2^64^? Why not start at 0 (it would add only a
fraction to the computation time) and fill a 1TB drive with the densest
counts possible? If the cost of the sieving is the bottleneck, requiring
months of computer time, it might be worthwhile to invest in even larger
storage media. The data in my latest upload are the result of a decade
of work by TRN and 350 years of computation time by TOS, including time on
Kraken, one of the fastest supercomputers in the world. Every time you
want denser data, you have to sieve your entire interval again. If
expensive supercomputer time and months or years of computation are
involved, I think it makes much more sense to invest a fraction of the
computation cost in larger storage, and then store denser counts than you
would ever think anyone would need.
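A quick back-of-envelope check (my own arithmetic, not from the ticket)
supports the storage argument above: assuming one 8-byte pi(x) value per
checkpoint, even the densest proposal in this thread fits comfortably on
a 1TB drive.

```python
BYTES_PER_ENTRY = 8  # pi(2^64) ~ 4.25e17, so 64-bit values suffice

def table_bytes(start, stop, spacing):
    """Storage for checkpoints start, start+spacing, ..., up to stop,
    at 8 bytes per pi(x) value."""
    entries = (stop - start) // spacing + 1
    return entries * BYTES_PER_ENTRY

GiB = 1024**3
# 1e19(1e9)2^64, the range currently stored with spacing 1e15:
print(table_bytes(10**19, 2**64, 10**9) / GiB)   # roughly 63 GiB
# Starting from 0 with spacing 1e9 over the full 64-bit range:
print(table_bytes(0, 2**64, 10**9) / GiB)        # roughly 137 GiB
```

Both figures are an order of magnitude below 1TB, so storage is cheap
relative to re-running months of sieving at supercomputer rates.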
>
>
> >
http://sage.math.washington.edu/home/kstueve/uploads/March21_2010/TableBasedPrimePi.zip
>
> '''Please''' separate the data from the code... ;-)
>
>
> -Leif
--
Ticket URL: <http://trac.sagemath.org/sage_trac/ticket/7013#comment:50>
Sage <http://www.sagemath.org>
Sage: Creating a Viable Open Source Alternative to Magma, Maple, Mathematica,
and MATLAB