https://bugs.documentfoundation.org/show_bug.cgi?id=160221

--- Comment #3 from Bruce H <[email protected]> ---
(In reply to ady from comment #2)
> (In reply to Bruce H from comment #0)
> 
> > 11.92
> > 11.29
> > =A1-A2
> > 
> > If the lines above are entered into a new spreadsheet, then the result that
> > appears in cell A3 is 0.63 (as one would expect).
> 
> Just expand the width of column A and you should see the same as the
> import-from-CSV value.


I was just describing what I see on screen when no cell formatting has been
applied and the column width has not been changed manually.  If you leave
those at their defaults, then the behavior (i.e. what is displayed on
screen) differs between the two cases.

OK, then let me try a different approach:

Cells have some default formatting (or so I assume).  If the user has not
modified a cell's formatting, then it appears that at least 15 significant
digits are displayed by default.  That seems ill-advised, considering that
we KNOW the accuracy of any floating-point math operation is generally
limited to about 14 digits.  Given that low-order digits in the 15th (or
later) position are not reliable, why not change the "default precision" to
14 significant digits?
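
For illustration, here is a quick Python sketch (Python floats are the same
IEEE 754 doubles that Calc uses internally, as far as I know):

  # Reproduce the subtraction from comment 0 using plain doubles.
  a = 11.92
  b = 11.29
  d = a - b
  print(repr(d))      # shortest round-trip form, e.g. 0.6300000000000008
  print(f"{d:.15g}")  # 15 significant digits: 0.630000000000001
  print(f"{d:.14g}")  # 14 significant digits: 0.63

At 15 significant digits the binary representation error becomes visible;
at 14 it rounds away, which is exactly the default behavior I am asking for.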

Another way to look at it: once you've gone past the maximum count of
reliable digits, any choice of display precision is arbitrary.  Why not
display 20 digits?  Or 50?
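
The same sketch makes the point about arbitrariness: Python will happily
print 20 or 50 "significant" digits of the result, but everything past
roughly the 17th digit is just the decimal expansion of the nearest binary
double, not information anyone entered:

  print(f"{d:.20g}")  # 20 digits: representation noise past ~digit 15
  print(f"{d:.50g}")  # 50 digits: even more noise, still printable

So any cutoff beyond the reliable digits is arbitrary; 14 is at least
defensible because it matches what the arithmetic can actually guarantee.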
