Thanks to Mr Dalgaard for his advice and everyone else who has
contributed. Inclusion of an error term at the end of sim.set$y = ...
line did cure my problems with drop1() and step().
I suppose it was my own inexperience in carrying out simulations that
caused such a gaffe.
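For anyone who hits the same problem: the cure amounts to adding a random error term when constructing the response. The original `sim.set$y` line was truncated in the archive, so the sketch below uses my own guessed variable names and coefficients; only the `rnorm(N)` term at the end is the actual point.

```r
library(MASS)

N <- 200
sigma <- matrix(c(1, .5, .5, 1), nrow = 2)

# Simulate two correlated predictors
sim.set <- as.data.frame(mvrnorm(N, mu = c(0, 0), Sigma = sigma))
names(sim.set) <- c("x1", "x2")

# Without the rnorm(N) term, y is an exact linear function of the
# predictors: the residual sum of squares is numerically zero, and the
# F tests in drop1() and step() misbehave.
sim.set$y <- 1 + 2 * sim.set$x1 - sim.set$x2 + rnorm(N)
```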
Thomas
Thomas P C Chu wrote:
Dear all,
I have been trying to investigate the behaviour of different weights
in weighted regression for a dataset with lots of missing data. As a
start I simulated some data using the following:
library(MASS)
N <- 200
sigma <- matrix(c(1, .5, .5, 1), nrow = 2)
sim.set
I am not sure why my messages are not threaded together. Thank you to
the author of this post:
https://stat.ethz.ch/pipermail/r-help/2008-August/169691.html
I have tried the suggestions, but I got the same results as in my
original query:
Dear all,
I have been trying to investigate the behaviour of different weights in
weighted regression for a dataset with lots of missing data. As a start
I simulated some data using the following:
library(MASS)
N <- 200
sigma <- matrix(c(1, .5, .5, 1), nrow = 2)
sim.set <-
Interestingly, if I fitted the model using glm() rather than lm(),
drop1() would behave as expected:
summary(model.glm <- glm(y ~ ., data = sim.set, family = 'gaussian'))
summary(model.lm <- lm(y ~ ., data = sim.set))
drop1(model.glm, test = 'F')
drop1(model.lm, test = 'F')
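As a side note, with a well-behaved response (nonzero residual variance) the two interfaces should agree, since a gaussian-family glm fits the same model as lm. A quick check, assuming a `sim.set` like the one in the thread but with noise in `y`:

```r
# lm and gaussian glm fit the same least-squares model, so their
# coefficients and drop1() F tests should match once the response
# contains a genuine error term.
fit.lm  <- lm(y ~ ., data = sim.set)
fit.glm <- glm(y ~ ., data = sim.set, family = gaussian)

drop1(fit.lm,  test = "F")
drop1(fit.glm, test = "F")

all.equal(coef(fit.lm), coef(fit.glm))
```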
model.glm <-
Thomas Chu wrote:
None of those 3 lines of commands managed to drop x4, and its P value
magically decreased from 0.94 to almost 0! I am also baffled by how R
calculated those RSS.
Maybe it is using a different type of SS. If I have an lm() model, and I do:
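The discrepancy described above is most likely sequential versus marginal sums of squares: anova() on an lm fit reports Type I (sequential) SS, which depend on the order the terms enter the formula, while drop1() reports the SS for deleting each term from the full model. A small sketch with toy data of my own (names are illustrative):

```r
set.seed(1)
d <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
d$x2 <- d$x2 + 0.7 * d$x1          # make the predictors correlated
d$y  <- d$x1 + d$x2 + rnorm(50)

fit <- lm(y ~ x1 + x2, data = d)

anova(fit)              # sequential (Type I) SS: x1 tested ignoring x2
drop1(fit, test = "F")  # marginal SS: each term tested as if added last
```

With correlated predictors the two tables give different SS and P values for the same term, which would explain the puzzling RSS figures.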