dineshv:
> Thanks for that about Python3. My integers range from 0 to 9,999,999
> and I have loads of them. Do you think Python3 will help?
Nope.
Bye,
bearophile
--
http://mail.python.org/mailman/listinfo/python-list
Yes, integer compression as in Unary, Golomb, and there are a few
other schemes.
It is known that for large (integer) data sets, encoding and decoding
the integers will save space (memory and/or storage) and doesn't
impact performance.
As the Python dictionary is a built-in (and an important
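(For readers following the thread: the Unary and Golomb schemes mentioned above can be sketched in a few lines of Python. This is purely illustrative, not anything Python's dict does internally, and for simplicity it assumes the Golomb parameter m is a power of two, i.e. Rice coding.)

```python
def unary(n):
    """Unary code: n ones followed by a terminating zero, e.g. 3 -> '1110'."""
    return "1" * n + "0"

def golomb_encode(n, m):
    """Golomb code of n with parameter m: unary quotient plus binary remainder.
    Sketch assumes m is a power of two (the Rice special case)."""
    q, r = divmod(n, m)
    b = m.bit_length() - 1            # remainder bits when m == 2**b
    return unary(q) + format(r, "0{}b".format(b))

def golomb_decode(bits, m):
    """Decode a single Golomb/Rice codeword from a bit string."""
    q = 0
    i = 0
    while bits[i] == "1":             # read the unary-coded quotient
        q += 1
        i += 1
    i += 1                            # skip the terminating zero
    b = m.bit_length() - 1
    r = int(bits[i:i + b], 2) if b else 0
    return q * m + r

print(golomb_encode(9, 4))            # -> '11001'
print(golomb_decode("11001", 4))      # -> 9
```

Small integers get short codes, which is where the space saving on large data sets comes from.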
dineshv:
> Yes, integer compression as in Unary, Golomb, and there are a few
> other schemes.
OK. Currently Python doesn't use Golomb or similar compression
schemes.
But in Python3 all integers are multi-precision ones (I don't know yet
what's bad with the design of Python2.6 integers), and a
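(A quick illustration of the multi-precision point: in Python 3 an int's size grows with its magnitude. A sketch assuming CPython; the exact byte counts are implementation details and vary by version and platform.)

```python
import sys

# Python 3 ints are arbitrary-precision; sys.getsizeof reports the
# per-object footprint, which grows as the value needs more digits.
for n in (0, 1, 9999999, 2**30, 2**100):
    print(n, sys.getsizeof(n), "bytes")
```

Note the per-object overhead dwarfs the payload for small values, which is why compact containers matter more than the int representation itself.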
dineshv wrote:
> Yes, integer compression as in Unary, Golomb, and there are a few
> other schemes.
> It is known that for large (integer) data sets, encoding and decoding
> the integers will save space (memory and/or storage) and doesn't
> impact performance.
As the Python dictionary is a
Hi bearophile
Thanks for that about Python3. My integers range from 0 to 9,999,999
and I have loads of them. Do you think Python3 will help?
I want to do testing on my local machine with the large numbers of
integers and was wondering if I can get away with an existing Python
data structure or
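(One existing standard-library structure worth testing for this range of values is the array module. A sketch under the assumption that compact bulk storage is the goal; array packs the values into fixed-size machine slots but does not give dict-style keyed lookup.)

```python
import sys
from array import array

# Values up to 9,999,999 fit in an unsigned 32-bit slot, so the array
# module stores each one in a few bytes instead of a full int object.
values = list(range(0, 10000000, 1000))   # sample of the 0..9,999,999 range
packed = array("I", values)               # "I" = unsigned int, 4 bytes on CPython

print(packed.itemsize, "bytes per element")
print("list container:", sys.getsizeof(values))
print("array container:", sys.getsizeof(packed))
```

The list figure above excludes the int objects it points to, so the real gap in total memory is larger than the two container sizes suggest.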
If you store a large number of integers (keys and values) in a
dictionary, do the Python internals perform integer compression to
save memory and enhance performance? Thanks
Dinesh
Dinesh:
> If you store a large number of integers (keys and values) in a
> dictionary, do the Python internals perform integer compression to
> save memory and enhance performance? Thanks
Please define what you mean by integer compression; it has several
meanings depending on the context. For