Hi,

Statistics in Sage is improving, but a lot remains to be done to make
it easy to use.  There has been some progress on this front, but to
get more advanced functionality you need to use things in scipy/numpy
(for example, do:
import scipy.stats as stat
and then tab-complete
stat.[tab]
to see a lot of what is available there), or you can use the R
interpreter, but that is not very user-friendly right now.
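To make the scipy.stats suggestion concrete, here is a short sketch
(assuming scipy is installed, as it is in a standard Sage build) using
stat.tstd, which computes the sample standard deviation and reproduces
the 0.57735... value from the original question:

```python
# A quick look at what scipy.stats offers for descriptive statistics.
import scipy.stats as stat

# Sample data: the grade distribution from the original question.
data = [1, 2, 2, 1]

# tstd computes the sample standard deviation (ddof=1 by default),
# matching vector(RDF, data).standard_deviation() in Sage.
print(stat.tstd(data))      # about 0.57735026919

# describe() gives several summary statistics at once.
print(stat.describe(data))
```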

It looks like standard deviation is available at top-level as "std",
but I think you are right that it isn't easy enough to find.
Strangely enough, using an RR vector with std works OK, but it doesn't
work with RDF!  I think this is because many operations over RDF get
sent to numpy and scipy, and the conversions aren't defined for RR.
This seems like a bug to me.
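Until that is sorted out, one workaround (just a sketch using plain
numpy on a Python list, not anything built into Sage's vector classes)
is to compute the standard deviation directly:

```python
# Workaround sketch: compute the standard deviation directly with
# numpy, bypassing the Sage vector types entirely.
import numpy as np

data = [1, 2, 2, 1]

# ddof=1 gives the sample (Bessel-corrected) standard deviation,
# the value reported by vector(RDF, data).standard_deviation().
print(np.std(data, ddof=1))   # about 0.57735026919

# ddof=0 (the numpy default) gives the population standard deviation.
print(np.std(data))           # 0.5
```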

-Marshall Hampton

On Apr 7, 10:29 am, "Kenneth A. Ribet" <[email protected]> wrote:
> Hello All,
>
> I asked myself how I could use sage to compute the standard deviation of a 
> grade distribution for one of my courses.  Rooting around, I found that I can 
> compute for example
>
> sage: vector(RDF,[1,2,2,1]).standard_deviation()
>
> and get the answer 0.57735026919.  However, if I try the same command with 
> "RDF" replaced by "RR," I get an AttributeError.  My first question is: What's 
> going on here; how come RDF and RR are so different in this context?  Their 
> respective descriptions look very similar --
>
> "An approximation to the field of real numbers using double precision 
> floating point numbers. Answers derived from calculations in this 
> approximation may differ from what they would be if those calculations were 
> performed in the true field of real numbers. This is due to the rounding 
> errors inherent to finite precision calculations."
>
> "An approximation to the field of real numbers using floating point numbers 
> with any specified precision. Answers derived from calculations in this 
> approximation may differ from what they would be if those calculations were 
> performed in the true field of real numbers. This is due to the rounding 
> errors inherent to finite precision calculations."
>
> If I had found some documentation about the standard deviation command, I 
> would probably have found the answer to my first question.  This leads 
> to my second question: Why don't I see information about standard_deviation 
> when I type "standard_deviation?" at the command line?
>
> Thanks in advance for the help!
>
> Ken Ribet

-- 
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/sage-support
URL: http://www.sagemath.org
