The next version of ECMAScript will probably have a 'decimal' datatype
for doing decimal math. Using this datatype, adding 0.1 to itself ten
times would give exactly 1.0, not 0.9999999999999999 as you currently
get due to the conversion from decimal to binary fractions.
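 
To see the current behavior this would fix, here is a small example you
can run with today's Number type (which is an IEEE 754 binary double);
the values in the comments are what it actually produces:

    // Decimal fractions like 0.1 have no exact binary representation,
    // so each operation carries a tiny rounding error.
    var sum = 0.1 + 0.2;     // 0.30000000000000004, not 0.3

    var total = 0;
    for (var i = 0; i < 10; i++) {
        total += 0.1;        // the errors accumulate
    }
    total === 1;             // false: total is 0.9999999999999999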
 
This datatype would probably support additional precision as well.
Number only gives you 15 or 16 significant digits. But if you had, say,
34, you could represent up to
$99,999,999,999,999,999,999,999,999,999,999.99 exactly, and that's
pretty large!
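 
As a rough sketch of where Number runs out today, exact integer
precision ends at 2^53, and a 34-digit dollar amount like the one above
silently collapses to the nearest representable double:

    // Number can't distinguish consecutive integers above 2^53.
    9007199254740992 + 1 === 9007199254740992;   // true: 2^53 + 1 rounds back to 2^53

    // The 34-digit amount rounds to the nearest double, losing the cents.
    var amount = 99999999999999999999999999999999.99;
    amount === 1e32;                              // true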
 
The Player team is thinking about how to introduce a type like this even
before the ECMAScript spec is complete, hopefully in a way that will be
compatible with the spec. They'd like to gather some input on
developers' requirements for decimal math. Some questions to think about
are...
 
What is your use case? Financial calculations? Scientific calculations?
 
Are you mainly interested in calculating with decimal rather than binary
fractions, or in having more significant digits, or are both important?
 
Do you need support for an arbitrary number of significant digits (i.e.,
"infinite precision")?
 
If not, how many significant digits are sufficient?
 
Do you need programmatic control over how much precision is used in
calculations (e.g., rounding to 5 decimal places in every intermediate
operation)?
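 
As a purely illustrative sketch of what that could mean, using today's
Number and a made-up helper rather than any proposed API:

    // Hypothetical: force every intermediate result to 5 decimal places.
    function to5(x) { return Number(x.toFixed(5)); }

    var rate  = to5(1 / 3);        // 0.33333
    var total = to5(rate * 300);   // 99.999 instead of 100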
 
Do you need programmatic control over how rounding works? (Round down,
round up, round to nearest, what happens with 1.5, etc.)
 
Do you care about whether a new type like 'decimal' gets automatically
coerced to other types like Number, int, and uint?
 
- Gordon
 
 
