Hi,

...and Singular (via Martin Albrecht's wrapping) is quite good
for this calculation:

sage: R.<x,y> = QQ[]
sage: time f = (1+x)^(2^13-1)
CPU times: user 0.01 s, sys: 0.06 s, total: 0.07 s
Wall time: 0.07
sage: R.<x,y> = GF(389)[]
sage: time f = (1+x)^(2^13-1)
CPU times: user 0.00 s, sys: 0.00 s, total: 0.00 s
Wall time: 0.00

MAGMA:

R<x,y> := PolynomialRing(RationalField(),2);
time f := (1+x)^(2^13-1);
... I give up after waiting for a minute!

But for a univariate polynomial it is quite fast:
> R<x> := PolynomialRing(IntegerRing());
> time f := (1+x)^(2^13-1);
Time: 0.110
(Note: additional timings vary a lot -- some are around 0.06s.)
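For reference, every system above is (at bottom) doing repeated squaring. Here is a minimal sketch in plain Python of that method -- the function name and structure are mine, not any library's; any object with `*` works as the base:

```python
def power(base, n):
    """Compute base**n (n >= 1) by binary exponentiation.

    Uses O(log n) multiplications: one squaring per bit of n,
    plus one extra multiply per set bit.
    """
    result = None          # lazily initialised, so no '1' element is needed
    square = base          # invariant: square == base**(2**i) at bit i
    while n > 0:
        if n & 1:          # this bit of n is set: fold the current square in
            result = square if result is None else result * square
        square = square * square
        n >>= 1
    return result
```

So, e.g., `power(x + 1, 2**13 - 1)` performs about 24 multiplications; the timing differences above come from how each system multiplies polynomials, not from the exponentiation scheme itself.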


On 8/27/07, Bill Hart <[EMAIL PROTECTED]> wrote:
>
> I wrote up a ridiculously naive polynomial powering function in FLINT.
> Here is a basic FLINT program for computing the above powers:
>
>     fmpz_poly_t poly, power;
>     fmpz_poly_init(power);
>     fmpz_poly_init(poly);
>     fmpz_poly_set_coeff_ui(poly, 0, 1);
>     fmpz_poly_set_coeff_ui(poly, 1, 1);
>     fmpz_poly_power(power, poly, (1UL<<13));
>
> Here are the times:
>
> real    0m1.190s
> user    0m0.960s
> sys     0m0.232s
>
> If I replace 2^13 with 2^13-1 I get:
>
> real    0m0.758s
> user    0m0.628s
> sys     0m0.128s
>
> I'm sure there's plenty we can do to speed that up. For a start the
> system time could trivially be all but wiped out by allocating all the
> required memory up front. There's probably also some FFT caching I
> could do but haven't.
>
> Polynomial evaluation should probably be used beyond some point, but
> we haven't implemented that yet.
>
> Bill.
>
>
> On 27 Aug, 18:32, [EMAIL PROTECTED] wrote:
> > I was recently contacted by Neill Clift, who is arguably the foremost 
> > expert on addition chains.  Though he's most concerned with computing 
> > minimal addition chains, which aren't always optimal and can take a 
> > ridiculous amount of time to compute, I believe that some of the work that 
> > he's done can be used to construct a rather generic addition chain package. 
> >  I don't expect to beat numeric exponentiation, but polynomial 
> > exponentiation seems oddly slow.
> >
> > Already, there seems to be a cutoff point where my work on addition chains 
> > can be used to improve the speed of exponentiation.
> >
> > (x^n)(x+1) constructs the polynomial x^n and evaluates it (this uses my 
> > polynomial evaluation code)
> >
> > The for loop performs binary exponentiation (I pick 2^13 and 2^13-1 to make 
> > this easy).  This puzzles me -- binary exponentiation in python currently 
> > beats the pants off of whatever is getting used for the polynomials.  What 
> > gives?
> >
> > sage: x = polygen(ZZ)
> > sage: n = 2^13
> >
> > sage: time a = (x+1)^n
> > CPU time: 16.82 s,  Wall time: 16.82 s
> >
> > sage: time a = (x^n)(x+1)
> > CPU time: 2.47 s,  Wall time: 2.47 s
> >
> > sage: %time
> > sage: z = x+1
> > sage: for i in range(13):
> > ...       z = z*z
> > CPU time: 2.46 s,  Wall time: 2.46 s
> >
> > sage: n = 2^13-1
> >
> > sage: time a = (x+1)^n
> > CPU time: 3.66 s,  Wall time: 3.66 s
> >
> > sage: time a = (x^n)(x+1)
> > CPU time: 0.91 s,  Wall time: 0.91 s
> >
> > sage: %time
> > sage: z = x+1
> > sage: y = z
> > sage: for i in range(12):
> > ...       z = z*z
> > ...       y*= z
> > CPU time: 1.61 s,  Wall time: 1.61 s
>
>
> >
>
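The two loops timed in the quoted message can be sketched with plain integers standing in for polynomials (the helper names below are mine, hypothetical, not Sage functions). Both use only O(log n) multiplications, which is why the slow generic `^` is so puzzling:

```python
def pow_2k(x, k):
    """x^(2^k) by k successive squarings (13 multiplies for k = 13)."""
    z = x
    for _ in range(k):
        z = z * z
    return z

def pow_2k_minus_1(x, k):
    """x^(2^k - 1) via a running product of partial squares.

    After i iterations, z == x^(2^i) and y == x^(2^(i+1) - 1),
    so k-1 iterations give x^(2^k - 1) in 2(k-1) multiplies
    (24 for k = 13), matching the y *= z loop in the thread.
    """
    z = x
    y = z
    for _ in range(k - 1):
        z = z * z
        y = y * z
    return y
```

For k = 13 both loops do roughly two dozen multiplications, the same order as generic binary exponentiation, so any large timing gap between them and the built-in `^` points at overhead per multiplication rather than at the addition chain used.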


-- 
William Stein
Associate Professor of Mathematics
University of Washington
http://www.williamstein.org
