Doubling the length of the data doubles the apparent number of observations. 
You would expect the standard errors to shrink by a factor of sqrt(2), which 
they very nearly do; the factor is not exact because the residual degrees of 
freedom go from n - p to 2n - p rather than exactly doubling.
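A minimal sketch of the effect, using made-up data (the variables and sample size here are illustrative, not from the original post):

```r
## Illustrative data: duplicating every row roughly divides the
## coefficient standard errors by sqrt(2), but not exactly.
set.seed(1)
d <- data.frame(x = 1:20)
d$y <- 2 * d$x + rnorm(20)

fit1 <- lm(y ~ x, data = d)
fit2 <- lm(y ~ x, data = rbind(d, d))   # same data, every row twice

se1 <- coef(summary(fit1))["x", "Std. Error"]
se2 <- coef(summary(fit2))["x", "Std. Error"]

## The exact ratio is sqrt((2n - p) / (n - p)): here n = 20, p = 2,
## so sqrt(38/18) ~ 1.453, a little above sqrt(2) ~ 1.414.
se1 / se2
```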

Weights are not as simple as they look. You have given all your data the same 
weight, so the answer is independent of the weights (!). Try again with 
weights = rep(4, 100) etc. Equal weights simply cancel out inside lm. In 
fact, some linear-regression implementations rescale all weights to sum to 1; 
in others, weights are scaled to average 1; done 'naturally', the weights 
simply appear in two places which cancel in the final covariance-matrix 
calculation (e.g. in the weighted residual standard deviation and in the 
Hessian of the chi-squared function, if I remember correctly). 
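The cancellation is easy to check directly; a quick sketch with made-up data (again illustrative, not the poster's data set):

```r
## Constant weights -- whatever their value -- leave both the coefficients
## and their standard errors unchanged in lm.
set.seed(1)
d <- data.frame(x = 1:20)
d$y <- 2 * d$x + rnorm(20)

fit_w1 <- lm(y ~ x, data = d, weights = rep(1, 20))
fit_w4 <- lm(y ~ x, data = d, weights = rep(4, 20))

## Identical coefficient tables: the constant weight cancels between the
## weighted residual variance and the (X'WX)^{-1} factor.
all.equal(coef(summary(fit_w1)), coef(summary(fit_w4)))
```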

Bottom line: equal weights make no difference in lm, so choose what you like. 
1 is a good number, though.

Steve e

>>> "hadley wickham" <[EMAIL PROTECTED]> 08/05/2007 10:08:34 >>>
Dear all,

I'm struggling with weighted least squares, where something that I had
assumed to be true appears not to be the case.  Take the following
data set as an example:



______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
