On 12 Jan 2000 09:05:37 -0800, [EMAIL PROTECTED] (steinberg)
wrote:

> I am seeking better understanding of the concept of degrees of
> freedom. Here's what I think I know:

 - I am going to make my comments in an unusual way this time, by
inserting my corrections in [brackets] --
 
> 1) Whenever a sum of squares is estimated, the result is
> constrained by the fact that the deviations about the mean 
  [, using the mean of the sample, ]  must
> sum to zero. The number of scores free to vary is therefore n-1.
> 
> 2) When estimating the population SD from a sample, SS/n is a
> biased estimate because the sample 
  [DELETE, " ... tends to be less variable than the population from
  which it comes."]
  [ADD, " ... SS around the TRUE mean reflects the population's
  variability exactly, but the SS has to come out smaller when you
  compute it around the sample's mean instead of the true mean."]
> SS/(n-1) is an unbiased estimate.
  [ -- unbiased for the population VARIANCE, strictly speaking; its
  square root is still a slightly biased estimate of the SD.]

 - I hope that answers (a), too.
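
If you want to see that numerically, here is a quick simulation
sketch (Python/numpy; the particular numbers and variable names are
just my illustration, not from the original posts).  It checks that
the deviations about the sample's own mean sum to zero, and that SS/n
around the sample mean runs too small while SS/(n-1) does not:

    import numpy as np

    rng = np.random.default_rng(0)
    true_mean, true_var, n, reps = 10.0, 4.0, 5, 50_000

    ss_true = np.empty(reps)   # SS about the TRUE mean
    ss_samp = np.empty(reps)   # SS about the sample's own mean
    for i in range(reps):
        x = rng.normal(true_mean, np.sqrt(true_var), size=n)
        # deviations about the sample mean always sum to (essentially) zero
        assert abs(np.sum(x - x.mean())) < 1e-9
        ss_true[i] = np.sum((x - true_mean) ** 2)
        ss_samp[i] = np.sum((x - x.mean()) ** 2)

    print(ss_true.mean() / n)        # ~ 4.0  unbiased around the true mean
    print(ss_samp.mean() / n)        # ~ 3.2  too small: biased
    print(ss_samp.mean() / (n - 1))  # ~ 4.0  dividing by n-1 fixes it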


> b) I am looking at the ANOVA table for a regression with two
> predictors and n=395. The total df is 394. I can explain that from
> either 1 or 2 above. However the df for regression is 2. Doesn't
> the fact that a sum of squares was computed for regression have
> an impact here as in 1 above? Isn't SSR also an estimate as in 2
> above?

N is 395.  
Total DF (around the mean) is 394.  
The regression uses up 2 of those DF, one for each fitted predictor,
so the Residual DF (around the regression line) is 394 - 2 = 392,
which is the DF that goes with the residual sum of squares, SSResid.
"SSR" is ambiguous here -- it could mean SS(Regression) or
SS(Residual).
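
If it helps to see the bookkeeping, here is a small sketch (again
Python/numpy, with simulated data -- only the DF arithmetic matters):
fit a regression with 2 predictors on n = 395 cases and check that
the sums of squares add up and the DF partition as 2 + 392 = 394.

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 395, 2                        # cases, predictors
    X = rng.normal(size=(n, p))
    y = 1.0 + X @ np.array([2.0, -1.0]) + rng.normal(size=n)

    Xd = np.column_stack([np.ones(n), X])           # add the intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)   # least-squares fit
    fitted = Xd @ beta

    ss_total = np.sum((y - y.mean()) ** 2)   # DF = n - 1      = 394
    ss_resid = np.sum((y - fitted) ** 2)     # DF = n - 1 - p  = 392
    ss_regr  = ss_total - ss_resid           # DF = p          =   2

    print(np.isclose(ss_regr + ss_resid, ss_total))  # True: SS add up
    print(p, n - 1 - p, n - 1)                        # 2  392  394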

-- 
Rich Ulrich, [EMAIL PROTECTED]
http://www.pitt.edu/~wpilib/index.html
