On 11 Nov 2009, at 00:23, Roger Binns wrote:
As a developer, past practice has trained me that integers are exact
and floating point is approximate (also "fast" and "slow"
respectively). Other than some older BASICs, JavaScript is the first
language I've come across in ages that doesn't have integers and
instead represents everything as float.
I looked up a few JavaScript tutorials and didn't find a single one
stating that all numbers are stored as float. In most cases they
deliberately distinguish between integers and floating point as two
different types. The distinction seems to come from the literal
syntax: integer literals can have a leading 0x/0 to specify hex/octal,
whereas floating-point literals cannot.
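The literal-syntax distinction is only skin deep, though: whatever notation you write, every numeric literal ends up as the same runtime type. A quick sketch (runnable in any JavaScript engine):

```javascript
// Hex, decimal, and fractional literals all produce the single
// JavaScript "number" type - an IEEE 754 double. There is no
// separate integer type to distinguish.
console.log(typeof 42);    // "number"
console.log(typeof 4.2);   // "number"
console.log(typeof 0x2a);  // "number"

// The hex literal is just alternate syntax for the same value.
console.log(0x2a === 42);  // true
```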
What difference does it make? An integer will never be "corrupted"
into anything other than the original value. The only real problem is
when dealing with decimals, as they will be "corrupted" to the
nearest representable value in floating point. Right? What am I missing?