On 2019-01-24 04:45, Ethan Resnick wrote:
Well, if you remove the trailing 0s you get an entirely different number.
That's pretty significant.
Note that this is the default ES serialization as well.
This makes no sense to me. Yes, removing trailing 0s, and therefore changing
the magnitude of the number, changes its meaning. But significant digits are
about capturing precision, not magnitude.
Hi Ethan,
I'm neither the designer of this API nor have I looked at the implementations,
but I guess that 21 comes from how the number serializer works without locale
settings.
Let's make this concrete:
The number 134449999999999984510435328 happens to have an exact floating point
representation. However, because that number is larger than the max safe
integer, many other integers are best approximated by the same floating point
value. 134449999999999980000000000 is one such number.
So, if you do:
134449999999999984510435328..toLocaleString('en', { maximumSignificantDigits:
21, useGrouping: false })
and
134449999999999980000000000..toLocaleString('en', { maximumSignificantDigits:
21, useGrouping: false })
you actually get the same output in each case, which makes sense, because both
numbers are represented by the same floating point behind the scenes.
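As a quick sanity check (independent of the Intl machinery entirely), you can
confirm that both literals collapse to the same double before any formatting
happens:

134449999999999984510435328 === 134449999999999980000000000  // true

so whatever serializer you hand the value to is seeing exactly the same input
in both cases.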
Right, the ES number serializer doesn't take these edge cases into consideration.
Now, it seems like the serialization logic in `toLocaleString` (or
`toPrecision`) has two options.
First, it could assume that the number it's serializing started life as a
decimal and got converted to the nearest floating point, in which case the
serialization code doesn't know the original intended number. In this case, its
best bet is probably to output 0s in those places where the original decimal
digits are unknown (i.e., for all digits beyond the precision that was stored).
This is actually what toLocaleString does; i.e., all digits after the 17th are
0, because 64-bit floating points can only store 17 decimal digits of
precision. This is where my original question came in, though: if a float can
only encode 17 digits of precision, why would the maximumSignificantDigits be
capped at 21? It seems like the values 18–21 are all just equivalent to 17.
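(A locale-independent way to see that 17 is the ceiling is the usual
round-trip check; this is just for illustration:

const x = 0.1 + 0.2;
x.toPrecision(17);               // "0.30000000000000004"
Number(x.toPrecision(17)) === x; // true: 17 digits fully recover the double
Number(x.toPrecision(16)) === x; // false: 16 digits already lose information

No double needs more than 17 significant decimal digits to round-trip, so
asking for 18-21 digits can't reveal anything about the value that 17 didn't
already pin down.)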
The other option is that the serialization code could assume that the number stored in
the float is exactly the number the user intended (rather than a best approximation of
some other decimal number). This is actually what `toPrecision` does. I.e., if you call
`toPrecision(21)` on either of the numbers given above, you get 21 digits, matching the
first 21 digits of the underlying float's exact decimal value:
`"1.34449999999999984510e+26"`. But, again, the limit of 21 seems odd here too: if
you're going to assume the float represents exactly the intended number, why not be
willing to output all 27 significant digits in the decimal above? Or more than 27
digits for the decimal representation of bigger floats?
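(For reference, the Ecma 262 spec for `toPrecision`, at least as of ES6/ES2015,
doesn't silently cap larger requests either; it rejects them:

(134449999999999984510435328).toPrecision(21); // "1.34449999999999984510e+26"
(134449999999999984510435328).toPrecision(22); // RangeError, since that spec only allows 1 through 21

so the extra digits of the exact expansion simply aren't reachable through that API.)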
Did you try:
(1.34449999999999984510e+250).toLocaleString('en', { maximumSignificantDigits:
21, useGrouping: false })
In Chrome I actually got 250 digits!
My conclusion is that the internationalization API wasn't designed for
"scientific" work.
It was probably created for displaying "normal" numbers, whatever that means :-)
Anders
In other words, it seems like `maximumSignificantDigits` should either be
capped at 17 (the real precision of the underlying float) or at 309 (the length
of the decimal representation of the largest float). But neither of those is
21, hence my original question...
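(The 309 here comes from the largest finite double; one way to check it, in an
engine with BigInt support:

BigInt(Number.MAX_VALUE).toString().length  // 309: 1.7976931348623157e+308 written out in full

i.e., that's the longest integer part any finite Number can have.)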
On Mon, Jan 21, 2019 at 2:32 AM Anders Rundgren <[email protected]> wrote:
This limit seems a bit strange though:
console.log(new Intl.NumberFormat('en', { maximumFractionDigits: 20
}).format(-0.0000033333333333333333));
Result: -0.00000333333333333333
That's actually two digits fewer than the default ES serialization produces.
"maximumFractionDigits" is limited to 20.
Anders
On 2019-01-21 06:54, Ethan Resnick wrote:
> if you input this in a browser debugger it will indeed respond with
the same 21 [sort of] significant digits
>
> 999999999999999900000
>
> I'm pretty sure the 0s don't count as significant digits
<https://www.wikiwand.com/en/Significant_figures> (and, with floating point
numbers, it makes sense that they wouldn't).
>
> I feel this is probably best asked at https://github.com/tc39/ecma402,
since it seems to imply a potential spec bug.
>
>
> Although my question was framed in terms of NumberFormat, I don't actually
think this is Ecma 402-specific. Specifically, I believe the limit originated with, or at
least also applies to, the Number.prototype.toPrecision
<https://www.ecma-international.org/ecma-262/6.0/#sec-number.prototype.toprecision>
API from Ecma 262 (where it is equally unexplained).
>
> That's true for decimal values, but the limit of 21 would also
include the fractional portion of the double value as well, so would need more
than 17, I think?
>
>
> My understanding of floating point encoding is that 17 digits will also
cover the fractional portion. The only case I can think of where 17 digits might
not be enough is if the number system is not base 10; e.g., a base 6 number system
would presumably require more digits. But, I don't see any such number systems as
output options in the NumberFormat API, and such localization concerns don't
really explain the limit in N.p.toPrecision linked above, which is definitely
dealing with base 10.
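(For what it's worth, the 17 falls straight out of the arithmetic: a double has a
53-bit significand, 53 * log10(2) is about 15.95, and the standard round-trip result
is that ceil(53 * log10(2)) + 1 = 17 decimal digits always suffice to recover a
binary64 value exactly, wherever the decimal point happens to sit.)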
>
> On Sun, Jan 20, 2019 at 4:48 PM Logan Smyth <[email protected]> wrote:
>
> It does seem unclear why the limit is 21. Is that maybe the most you
need to uniquely stringify any double value?
>
> > can only encode up to 17 significant decimal digits
>
> That's true for decimal values, but the limit of 21 would also
include the fractional portion of the double value as well, so would need more
than 17, I think?
>
> On Sun, Jan 20, 2019 at 1:18 PM Isiah Meadows <[email protected]> wrote:
>
> I feel this is probably best asked at
https://github.com/tc39/ecma402, since it seems to imply a potential spec bug.
>
> -----
>
> Isiah Meadows
> [email protected]
> www.isiahmeadows.com
>
>
> On Sun, Jan 20, 2019 at 2:31 PM Anders Rundgren <[email protected]> wrote:
>
> On 2019-01-20 20:18, Ethan Resnick wrote:
> > Hi,
> >
> > Apologies if es-discuss is the wrong venue for this; I've
already tried poring through the specs and asking online, to no avail.
> >
> > My question is: why is the limit for the
`maximumSignificantDigits` option in the `NumberFormat` API set at 21? This seems
rather arbitrary, and especially odd to me given that, iiuc, all Numbers in JS, as
64-bit floats, can only encode up to 17 significant decimal digits. Is this some sort
of weird historical artifact or something? Should the rationale be documented
anywhere?
>
> I don't know for sure, but if you input this in a browser
debugger it will indeed respond with the same 21 [sort of] significant digits
> 999999999999999900000
>
> rgds,
> Anders
> >
> > Thanks!
> >
> > Ethan
> >
_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss