On Thu, 5 Jun 2003 15:40:52 +0000
"Wegmann (LIST)" <[EMAIL PROTECTED]> wrote:

> Hello R-user
> 
> I want to compute a multiple regression, but I would like to include a check for 
> collinearity of the variables. Therefore I would like to use a ridge 
> regression. 
> I tried lm.ridge() but I don't yet know how to get p-values (individual Pr() values 
> and the p of the whole model) out of this model. Can anybody tell me how to get 
> output similar to the summary(lm(...)) output? Or is there another way 
> (e.g. arguments to lm()) to include a correction for collinearity? 
> 
> I hope I was precise enough and included all necessary information; otherwise I 
> can add some more info. 
> 
> thanks in advance, Cheers Martin
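[For reference, a minimal lm.ridge sketch from the MASS package; the data, variable names, and lambda grid below are made up for illustration. As the question notes, the fit reports coefficients but no p-values:]

```r
library(MASS)                 # provides lm.ridge() and select()
set.seed(3)
x1 <- rnorm(60)
x2 <- x1 + rnorm(60, sd = 0.1)          # nearly collinear with x1
y  <- x1 + rnorm(60)
d  <- data.frame(y, x1, x2)
# fit over an (arbitrary) grid of ridge penalties
fit <- lm.ridge(y ~ x1 + x2, data = d, lambda = seq(0, 10, by = 0.1))
select(fit)        # GCV / HKB / L-W suggestions for choosing lambda
coef(fit)[1, ]     # coefficients at lambda = 0; note: no Pr() values anywhere
```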

This doesn't really answer your question, but the Design package's ols function is 
another way to handle penalized least squares.  ols has advantages if you want to 
differentially penalize different types of terms in the model, or if you have any 
categorical predictors.  Ordinary ridge regression does not correctly scale such 
variables in my opinion.
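[A hedged sketch of that differential penalization, using ols's penalty argument as a list; the data and variable names are invented, and the penalty values are arbitrary rather than tuned:]

```r
library(Design)               # ols(), rcs(); Design also loads Hmisc
set.seed(1)
d <- data.frame(x1 = rnorm(50))
d$x2  <- d$x1 + rnorm(50, sd = 0.2)     # nearly collinear with x1
d$grp <- factor(sample(c("a", "b", "c"), 50, replace = TRUE))
d$y   <- d$x1 + (d$grp == "b") + rnorm(50)
# penalize nonlinear (spline) terms more heavily than simple linear ones
f <- ols(y ~ rcs(x1, 3) + x2 + grp, data = d,
         penalty = list(simple = 1, nonlinear = 3))
f            # penalized coefficients; anova(f) gives the (debated) tests
```

[pentrace() in the same package can help choose the penalty values by AIC/BIC rather than guessing them.]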

The anova method for ols fits 'works' when you penalize the model but there is some 
controversy over whether we should be testing biased coefficients.  Some believe that 
hypothesis tests should be done using the unpenalized model.  That brings up other 
ways to handle collinearity: test groups of variables in combination so they don't 
compete with each other, or collapse them into summary scores (e.g., principal 
components) before putting them in the model. 
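[Both unpenalized alternatives can be sketched in a few lines of base R; the data and names here are made up for illustration:]

```r
set.seed(2)
x1 <- rnorm(100)
x2 <- x1 + rnorm(100, sd = 0.1)         # x1 and x2 nearly collinear
z  <- rnorm(100)
y  <- x1 + x2 + z + rnorm(100)
# (1) test the collinear variables as a group so they don't compete:
full    <- lm(y ~ x1 + x2 + z)
reduced <- lm(y ~ z)
anova(reduced, full)                    # joint F-test for x1 and x2 together
# (2) collapse them into a principal-component score before modeling:
pc1 <- prcomp(cbind(x1, x2), scale. = TRUE)$x[, 1]
summary(lm(y ~ pc1 + z))                # ordinary p-values, collinearity gone
```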

---
Frank E Harrell Jr              Prof. of Biostatistics & Statistics
Div. of Biostatistics & Epidem. Dept. of Health Evaluation Sciences
U. Virginia School of Medicine  http://hesweb1.med.virginia.edu/biostat

______________________________________________
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help