Okay, so how do I get decimal to set the precision in *significant digits*?
That is exactly what it is doing. getcontext().prec sets how many significant digits are used for the coefficient (the mantissa) of the number.
Have you taken any physics classes? What is 1000/7 to two significant digits? It is 140, not 142.86, which has *5* significant digits. Changing prec has exactly this effect:
>>> from decimal import *
>>> getcontext().prec = 2
>>> one = Decimal(1)
>>> seven = Decimal(7)
>>> one/seven
Decimal("0.14")
>>> 1000 * one/seven
Decimal("1.4E+2")
>>> getcontext().prec = 20
>>> one/seven
Decimal("0.14285714285714285714")
>>> 1000 * one/seven
Decimal("142.85714285714285714")
Why have a decimal.getcontext().prec if it doesn't provide a useful result? The number of digits in a number is irrelevant to that number's value. It just doesn't make sense to me.
It's just like choosing between float and double in Java or C - it sets the precision of the underlying representation. It is a tradeoff between accuracy, speed, and memory use.
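(If what you actually want is a fixed number of *decimal places* rather than significant digits, Decimal.quantize() does that; a minimal sketch, again in a fresh interpreter with the default 28-digit precision:)
>>> from decimal import Decimal
>>> value = Decimal(1000) / Decimal(7)     # computed to the full context precision
>>> value
Decimal("142.8571428571428571428571429")
>>> value.quantize(Decimal("0.01"))        # round to two decimal places, not two significant digits
Decimal("142.86")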
Kent