> People generally expect math to work how they've been taught in
> school. When javascript violates their expectations, that is the very
> definition of a bug.

School math is exact; there are no approximations at all.  Neither
binary nor decimal floating point is exact, and both will violate
people's expectations of accuracy.  1/3 is not representable in either
base, so the moment you compute 1.0 / 3.0 you have an approximation;
decimal buys you nothing there.  (Binary doubles actually happen to
round (1.0 / 3.0) * 3.0 back to exactly 1.0, while a 16-digit decimal
format gives 0.9999999999999999.)
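A quick illustration of what binary doubles (which is what JS numbers
are) do to these values:

```javascript
// Binary doubles cannot represent 0.1, 0.2, or 1/3 exactly.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// 1.0 / 3.0 is already an approximation before you ever multiply:
console.log((1 / 3).toFixed(20));
```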

>  These are facts that a novice programmer does not know or care about
> in the slightest. All they know is that even their desktop calculator
> can do this work properly, it must be javascript, or apple, or
> whatever that is broken. This is obvious to anyone that doesn't look
> at the world through nerd colored glasses. It's a fact so painful that
> microsoft even went to the trouble of pouring in an epic amount of
> research and development into their calc.exe so it doesn't display
> this broken looking behavior, as it displayed in earlier versions. A
> programmer might take the trouble to learn these esoteric computer
> facts, but end users won't.

If you're writing a calculator application you can use a decimal
library.  I'm not saying decimal can never be useful; what I'm saying
is that it isn't useful enough to justify forcing every implementation
to include it.
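For the calculator case, even a tiny scaled-integer sketch shows the
idea (the names and the fixed 4-digit scale here are my own
illustration; a real application would use an existing decimal
library):

```javascript
// Minimal fixed-point decimal arithmetic on BigInt: every value is an
// integer count of 1/10000ths, so decimal fractions stay exact.
const SCALE = 4;                    // digits after the decimal point
const UNIT = 10n ** BigInt(SCALE);  // 10^4

// Parse "12.34" into a scaled BigInt (123400n at SCALE = 4).
function parseDec(s) {
  const [int, frac = ''] = s.split('.');
  const digits = frac.padEnd(SCALE, '0').slice(0, SCALE);
  const sign = int.startsWith('-') ? -1n : 1n;
  return BigInt(int) * UNIT + sign * BigInt(digits);
}

// Format a scaled BigInt back into a decimal string.
function formatDec(v) {
  const sign = v < 0n ? '-' : '';
  const abs = v < 0n ? -v : v;
  return `${sign}${abs / UNIT}.${(abs % UNIT).toString().padStart(SCALE, '0')}`;
}

// Addition at a common scale is exact integer addition.
const add = (a, b) => a + b;

console.log(formatDec(add(parseDec('0.1'), parseDec('0.2')))); // "0.3000"
console.log(0.1 + 0.2);                             // 0.30000000000000004
```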

> And finally, it's highly likely that javascript will be used more and
> more on the server side in years to come, so javascript being a
> "client side" language no longer works as a credible excuse.

Server-side JS is a different and much easier situation.  On the
server you control the implementation you use.  Since JS out of the
box lacks any way to interact with its surroundings, you're already
more or less forced to extend your implementation to do anything
meaningful.  If you want decimal arithmetic on the server you're free
to add it, either as a library or as a native extension.
_______________________________________________
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss
