[I am still recovering, so if I say something totally misinformed I blame my recovery. :) ]
-On [20080125 15:12], Christian Heimes ([EMAIL PROTECTED]) wrote:
>Python 3.0:
>
> 2.4 ( 2,  3,  2,  2,  2)
> 2.6 ( 2,  3,  3,  2,  2)
>-2.4 (-3, -2, -2, -2, -2)
>-2.6 (-3, -2, -3, -2, -2)
>
>Python 2.6:
>
> 2.4 ( 2.0,  3.0,  2.0,  2,  2)
> 2.6 ( 2.0,  3.0,  3.0,  2,  2)
>-2.4 (-3.0, -2.0, -2.0, -2, -2)
>-2.6 (-3.0, -2.0, -3.0, -2, -2)

Am I the only one who wonders about this sudden change in decimal significance? Especially given that the ISO C standard specifies floor(), for example, as returning a floating-point value, whereas the Python 3.0 results above deviate from that and return a plain integer, which also differs from 2.5's behaviour.

Can I assume we are all familiar with the concept of significant digits, and that we agree that from this point of view 2 != 2.0? And that results such as the above would be a regression and a loss of precision?

-- 
Jeroen Ruigrok van der Werven <asmodai(-at-)in-nomine.org> / asmodai
イェルーン ラウフロック ヴァン デル ウェルヴェン
http://www.in-nomine.org/ | http://www.rangaku.org/
We have met the enemy and they are ours...
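P.S. For what it is worth, here is a small sketch that should reproduce the quoted numbers on both interpreters. I am assuming the columns are floor(), ceil(), round(), int() and trunc(); the quoted snippet does not label them, so take the mapping as a guess:

    import math

    # Assumed column order: floor, ceil, round, int, trunc
    # (the quoted table does not name its columns).
    for x in (2.4, 2.6, -2.4, -2.6):
        results = (math.floor(x), math.ceil(x), round(x), int(x), math.trunc(x))
        print("%4s %s" % (x, results))

Run under 2.6 the first three columns come back as floats; under 3.0 they come back as plain ints, which is exactly the loss of significance I am worried about.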