On Tuesday, April 18, 2023 at 4:19:45 PM UTC-6 Nils Bruin wrote:

> The present default is to use "1.1" as a designation of a floating point 
> literal, with an implied precision derived from the number of digits used 
> to write down the mantissa. 


This is true in some contexts, but not in others. 
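
Concretely, the present default behaves something like this in a Sage 
session (the bit counts reflect my understanding of the preparser, so take 
the exact numbers as illustrative):

    sage: (1.1).parent()
    Real Field with 53 bits of precision
    sage: n = 1.1000000000000000000000000000000000000000000000000000000000
    sage: n.prec() > 53     # a longer mantissa implies a wider field
    True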

In a physics or engineering context, everyone would see 1.1 as having only 
two significant figures. 

In a *math* context, no one would see it as having only two significant 
figures. 

Here's a hypothetical test you could do:

Suppose someone scrawls "1.1" on the blackboard in the math common room, 
and a mathematician walks in and sees it. What does he think its value is?

If you ask him to paraphrase its value, he would say "11/10", "1 plus 
1/10", or something else equivalent to 11/10.

It would not occur to him that it could also be a finite-precision floating 
point quantity, unless you told him that 11/10 was not the right 
interpretation.

In other words, the mathematician's default semantics for "1.1" is 11/10.

The physicist's default semantics is 1.1 * 10^0, only two significant 
figures.
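
For a concrete sense of how far apart the two readings are, plain Python's 
fractions module (nothing Sage-specific here, just an illustration) shows 
them side by side:

    >>> from fractions import Fraction
    >>> Fraction("1.1")   # the math reading: exactly 11/10
    Fraction(11, 10)
    >>> Fraction(1.1)     # the floating-point reading: the nearest 53-bit double
    Fraction(2476979795053773, 2251799813685248)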

Which semantics should Sage use?

I would say Sage is for math, so it should use the math semantics.

In high-precision environments like RealField(1000), Sage should 
*definitely* use the math semantics, because physicists, engineers, and 
other applied folks have zero use for 1000 bits of precision in anything 
they do. 

Only a math person, doing math, has a use for 1000 bits.

So RealField and RealLazyField should use the math semantics.

They should see 1.1 as 11/10.
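
Concretely, the math semantics would mean the following holds -- this is a 
sketch of the *proposed* behaviour, not a description of what Sage 
currently guarantees:

    sage: R = RealField(1000)
    sage: # the literal 1.1 names the rational 11/10, so coercing it into a
    sage: # 1000-bit field gives 11/10 correctly rounded to 1000 bits:
    sage: R(1.1) == R(11/10)
    True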

-aw
