Hello All,

I asked myself how I could use Sage to compute the standard deviation of a 
grade distribution for one of my courses.  Rooting around, I found that I can 
compute, for example,

sage: vector(RDF,[1,2,2,1]).standard_deviation()

and get the answer 0.57735026919.  However, if I try the same command with 
"RDF" replaced by "RR," I get anAttributeError.  My first question is: What's 
going on here; how come RDF and RR are so different in this context?  Their 
respective descriptions look very similar --

"An approximation to the field of real numbers using double precision floating 
point numbers. Answers derived from calculations in this approximation may 
differ from what they would be if those calculations were performed in the true 
field of real numbers. This is due to the rounding errors inherent to finite 
precision calculations."

"An approximation to the field of real numbers using floating point numbers 
with any specified precision. Answers derived from calculations in this 
approximation may differ from what they would be if those calculations were 
performed in the true field of real numbers. This is due to the rounding errors 
inherent to finite precision calculations."
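
For reference, the value 0.57735026919 above is 1/sqrt(3), i.e. the sample 
standard deviation with an n-1 denominator.  A quick sketch of that definition, 
using only field arithmetic so it should run over RR as well as RDF (the helper 
name sample_std is just something I made up), would be:

sage: def sample_std(v):
....:     # sample standard deviation with the n-1 denominator, computed
....:     # straight from the definition, so only +, -, *, / and sqrt are needed
....:     n = len(v)
....:     m = sum(v) / n
....:     return sqrt(sum((x - m)^2 for x in v) / (n - 1))
....:
sage: sample_std(vector(RR, [1, 2, 2, 1]))   # should agree with 1/sqrt(3) ~ 0.5774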

If I had found some documentation about the standard_deviation command, I would 
probably have found the answer to my first question.  This leads to my second 
question: Why don't I see information about standard_deviation when I type 
"standard_deviation?" at the command line?

Thanks in advance for the help!

Ken Ribet
