On 2014-09-10 15:06, Christophe Bal wrote:
Hello.

I would like to do two things.

 1. Know the number of decimal digits that Sage uses in a program.
 2. Choose the number of decimal digits displayed.

I believe the recommended way to choose the precision is to do:

RR = RealField(1000)    # 1000 bits of precision

Then do all your computations in RR by converting your input to RR:

a = RR(1/3)
b = RR(pi)

When you now compute a+b, it will be carried out with 1000 bits of precision (roughly 301 decimal digits).
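Note that `RealField(1000)` means 1000 *bits*, not decimal digits; each bit is worth log10(2) ≈ 0.301 decimal digits. A quick plain-Python check of the conversion:

```python
import math

bits = 1000
# Each binary digit carries log10(2) decimal digits of information.
digits = bits * math.log10(2)
print(int(digits))  # 301
```

So if you want a given number of decimal digits, divide by log10(2) ≈ 0.301 to get the bit precision to pass to RealField.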
