New submission from Tom Goddard <godd...@cgl.ucsf.edu>:

In Python 2.7, random.seed() with a string argument is documented as being 
equivalent to calling random.seed() with the hash of that string.  This is not 
the actual behavior.  Reading the _random C code shows that the seed routine in 
fact casts the signed hash value to unsigned long before using it.  The same 
appears to be true of Python 2.5.2.  Rather than change the behavior in 2.7.1, 
it seems preferable to simply correct the documentation there and preserve 
backward compatibility.  Bug #7889 has already addressed this problem in Python 
3.2 by eliminating the use of hash() for non-integer random.seed() arguments.  
I ran into this while trying to produce identical sequences of random numbers 
on 64-bit and 32-bit architectures.
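
To state the observed behavior in Python terms (this is my reading of the C 
code, not an official description): random.seed(s) for a string s appears to be 
equivalent to seeding with hash(s) reduced modulo 2**(8 * sizeof(unsigned 
long)).  A minimal sketch of that reading follows; struct.calcsize('L') gives 
the width of a C unsigned long on the running build:

import random
import struct

def seed_as_observed(s):
    # What random.seed(s) appears to actually do for a string s:
    # take hash(s) and cast it to C unsigned long, i.e. reduce it
    # modulo 2**(8 * sizeof(unsigned long)) for the current build.
    bits = 8 * struct.calcsize('L')
    random.seed(hash(s) % (2 ** bits))

def seed_as_documented(s):
    # What the 2.7 documentation says random.seed(s) does.
    random.seed(hash(s))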

Here is a demonstration of the bug in Python 2.7, 32-bit:

>>> import random
>>> random.seed('1pov')
>>> random.uniform(0, 1)
0.713827305919223

>>> random.seed(hash('1pov'))
>>> random.uniform(0, 1)
0.40934677883730686

>>> hash('1pov')
-747753952

>>> random.seed(hash('1pov') + 2**32)   # the unsigned long cast, on a 32-bit build
>>> random.uniform(0, 1)
0.713827305919223
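
For the reproducibility goal mentioned above, a possible workaround (a sketch, 
not something proposed in this report) is to derive the seed integer from the 
string with a platform-independent digest, so that hash() and the unsigned 
long cast never enter into it:

import hashlib
import random

def portable_seed(s):
    # The md5 digest of the string is the same on every platform and
    # build, so the derived integer seed, and hence the random stream,
    # is the same everywhere, unlike hash(s).
    random.seed(int(hashlib.md5(s).hexdigest(), 16))

portable_seed('1pov')
print random.uniform(0, 1)   # same value on 32-bit and 64-bit builds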

----------
components: Library (Lib)
messages: 117988
nosy: goddard
priority: normal
severity: normal
status: open
title: random.seed not initialized as advertised
type: behavior
versions: Python 2.7

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue10025>
_______________________________________