In article <8928qk$9ov$[EMAIL PROTECTED]>, Victor Aina <[EMAIL PROTECTED]> wrote:
>Hi all:
>I was wondering if anyone might have an opinion
>about the impact of subtracting a constant, e.g.
>the mean, from a regressor that
>happens to be collinear with another one.
>A little algebra demonstrates that for least
>squares regression, only the intercept term
>changes when a constant is subtracted from a variable.
>The other slope coefficients remain unchanged.
>It would be nice to know how prediction/forecasting
>is affected. Surely the condition index falls.
>Is collinearity masked in some way? Are the coefficients
>more efficient (in terms of variance) after subtracting
>the mean?
>What happens if we have a nonlinear regression model
>such as logistic regression, etc.?
If there is a linear term, nothing changes except the
coefficient of that term and the joint distribution of its
estimate with the other estimates. This applies if one
subtracts constants from as many of the variables as one
wants.
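A quick numerical check (a sketch in Python with numpy; the data, seeds, and
names are made up for illustration) shows this for ordinary least squares:
centering the regressors leaves the slope estimates and the fitted values
unchanged, shifts only the intercept, and sharply lowers the condition number
of the design matrix, which is the sense in which the condition index falls.

import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two strongly correlated regressors with a large common mean,
# so the raw design matrix is ill-conditioned.
x1 = 50 + rng.normal(size=n)
x2 = 0.95 * x1 + 0.1 * rng.normal(size=n)
y = 3.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

def ols(X, y):
    """Least-squares fit with an explicit intercept column."""
    Z = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta, Z

X_raw = np.column_stack([x1, x2])
X_ctr = X_raw - X_raw.mean(axis=0)      # subtract each regressor's mean

b_raw, Z_raw = ols(X_raw, y)
b_ctr, Z_ctr = ols(X_ctr, y)

print("slopes (raw)     :", b_raw[1:])
print("slopes (centered):", b_ctr[1:])            # identical to the raw slopes
print("intercepts       :", b_raw[0], b_ctr[0])   # only the intercept differs
print("fitted values equal:",
      np.allclose(Z_raw @ b_raw, Z_ctr @ b_ctr))
print("condition numbers:",
      np.linalg.cond(Z_raw), np.linalg.cond(Z_ctr))  # drops after centering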
For any model, changing the parametrization only changes
the way that one presents the results, not the results
of the investigation.
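The same invariance can be checked for a nonlinear model such as the logistic
regression the original post asks about. The sketch below (again an
illustration, not any package's method) fits the model by plain
Newton-Raphson with the regressor raw and then centered: the slope and the
fitted probabilities agree, and only the intercept is reparametrized.

import numpy as np

def fit_logistic(Z, y, n_iter=25):
    """Newton-Raphson for logistic regression; Z already has an intercept column."""
    beta = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Z @ beta))
        W = p * (1.0 - p)
        H = Z.T @ (Z * W[:, None])     # observed information matrix
        g = Z.T @ (y - p)              # score (gradient of the log-likelihood)
        beta += np.linalg.solve(H, g)
    return beta

rng = np.random.default_rng(1)
n = 500
x = 10 + rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-(-5.0 + 0.5 * x)))
y = rng.binomial(1, p_true)

Z_raw = np.column_stack([np.ones(n), x])
Z_ctr = np.column_stack([np.ones(n), x - x.mean()])

b_raw = fit_logistic(Z_raw, y)
b_ctr = fit_logistic(Z_ctr, y)

print("slopes     :", b_raw[1], b_ctr[1])   # same slope either way
print("intercepts :", b_raw[0], b_ctr[0])   # differ by slope * mean(x)
print("fitted probabilities equal:",
      np.allclose(1/(1+np.exp(-Z_raw @ b_raw)),
                  1/(1+np.exp(-Z_ctr @ b_ctr))))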
--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054 FAX: (765)494-0558