Ok, it took me some effort to accept this, but it's clearer now.

One last question: why doesn't RDF incorporate this feature? Is it because it 
comes from the GSL library, which is an independent project? Or because its 
precision is known a priori, like the float type in Python?
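
For context, this is what I mean (a minimal sketch from my session; I am 
assuming RDF.precision() is the right way to query its fixed precision):

sage: RDF(1)           # RDF prints like a Python float, no trailing zeros
1.0
sage: RDF.precision()  # its precision is fixed at 53 bits
53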


On Monday, October 6, 2014 3:39:32 PM UTC-3, Jeroen Demeyer wrote:
>
> On 2014-10-06 18:03, João Alberto wrote: 
> > Is this a correct behavior of Sage? 
> It's a feature, not a bug. The reason is that the number of digits gives 
> an idea about the precision of the number. Compare 
>
> sage: RealField(20)(1) 
> 1.0000 
> sage: RealField(100)(1) 
> 1.0000000000000000000000000000 
>
> If both of these were printed as "1.0", you would lose this information 
> about the precision. 
>
> Python has a fixed precision of 53 bits and prints a minimal number of 
> digits. 
>
