If you add up all the deviation scores, the sum will ALWAYS equal 0, because 
the negative deviations from the mean always balance out the positive 
deviations.  That is the nature of the arithmetic average.  And since 0 
divided by any number is still 0, the simple average of the raw deviations 
tells you nothing about variability.  Squaring the deviation scores turns the 
negative numbers into positive numbers.  The variance is the average of the 
squared deviation scores, and taking the square root of the variance returns 
the measure to the original unit of the scores.  Hope that helps.  I'm not a 
stat historian, but I do teach a stats course.
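If a concrete illustration helps, here is a short Python sketch (the five 
scores are just made-up numbers) showing that the raw deviations sum to 0, 
and how the variance, the standard deviation, and the mean absolute deviation 
your student asked about work out for the same data:

    from math import sqrt

    scores = [2, 4, 4, 4, 6]                 # made-up sample scores
    mean = sum(scores) / len(scores)         # arithmetic average = 4.0

    deviations = [x - mean for x in scores]  # -2, 0, 0, 0, 2
    print(sum(deviations))                   # 0.0 -- negatives cancel positives

    # variance as the average of the squared deviations (dividing by n)
    variance = sum(d ** 2 for d in deviations) / len(scores)
    sd = sqrt(variance)                      # back in the original unit of measure
    mad = sum(abs(d) for d in deviations) / len(scores)   # mean absolute deviation

    print(variance, sd, mad)                 # 1.6, about 1.26, 0.8

The sketch divides by n, following the "average of the squared deviation 
scores" wording above; a sample estimate would divide by n - 1 instead.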

===== Original Message From [EMAIL PROTECTED] =====
>Greetings to all stats historians out there
>
>A student asked today, upon starting discussion of the standard deviation,
>why we do not simply use the average (mean) absolute value of the deviation
>scores, rather than taking the square root of the sum of squares.  Many
>introductory stats texts provide no explicit rationale or history of the
>standard deviation formula.  The mean absolute value of the deviation scores
>would seem to be a reasonable descriptive measure of the variability of the
>scores.
>
>TIA for any help you can provide.
>
>Linda Tollefsrud
>University of Wisconsin - Barron County
>1800 College Drive
>Rice Lake, WI  54868-2497
>[EMAIL PROTECTED]
>(715) 234-8176 ext. 5417

Sally A. Radmacher, Ph.D.
Professor of Psychology
Missouri Western State College
4525 Downs Drive
St. Joseph, MO  64507
[EMAIL PROTECTED]
(816) 271-4353
