On 26/03/2018 10:34, Steven D'Aprano wrote:
> On Mon, 26 Mar 2018 02:37:44 +0100, bartc wrote:
>
>> If I instead initialise C using 'C = int("288712...")', then timings
>> increase as follows:
>
> Given that the original number given had 397 digits and has a bit length
> of 1318, I must admit to some curiosity as to how exactly you managed to
> cast it to a C int (32 bits on most platforms).
>
> It is too big for an int, a long (64 bits), a long-long (128 bits) or
> even a long-long-long-long-long-long-long-long-long-long-long-long-long-
> long-long-long (1024 bits), if such a thing even exists.
>
> So what exactly did you do?

I'm not sure why you think the language C came into it. I did this:

def fn():
    C = int(
    "28871482380507712126714295971303939919776094592797"
    "22700926516024197432303799152733116328983144639225"
    "94197780311092934965557841894944174093380561511397"
    "99994215424169339729054237110027510420801349667317"
    "55152859226962916775325475044445856101949404200039"
    "90443211677661994962953925045269871932907037356403"
    "22737012784538991261203092448414947289768854060249"
    "76768122077071687938121709811322297802059565867")

#   C = 2887148238050771212671429... [truncated for this post]

    D = C + C    # a single arithmetic operation on the converted value

for i in range(1000000):
    fn()

The purpose was to establish how the overhead of such int("...") conversions compares with the cost of actual arithmetic on the resulting numbers.
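A rough way to separate the two costs is to time the conversion and the arithmetic independently with timeit. A minimal sketch along those lines is below; the digit string is a shortened stand-in for the full 397-digit constant (which is truncated in this post), and the repeat count is arbitrary:

import timeit

# Stand-in value: 50 digits repeated 8 times, i.e. a 400-digit number.
digits = "28871482380507712126714295971303939919776094592797" * 8

# Cost of the string-to-int conversion alone.
conv = timeit.timeit("int(digits)", globals=globals(), number=100000)

# Cost of one addition on the already-converted number.
C = int(digits)
add = timeit.timeit("C + C", globals=globals(), number=100000)

print("int(str):", conv)
print("C + C   :", add)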

--
bartc