Hi all:

I was wondering if anyone might have an opinion
about the impact of subtracting a constant (e.g.
the mean of a variable) from a regressor that
happens to be collinear with another one.

A little algebra demonstrates that in least
squares regression, only the intercept term
changes when a constant is subtracted from a regressor;
the other slope coefficients remain unchanged.
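A quick numerical sanity check of that algebra (plain NumPy on simulated
data; the near-collinear pair x1/x2 and all the numbers are just
illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)          # nearly collinear with x1
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

def ols(X, y):
    # ordinary least squares with an explicit intercept column
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

b_raw = ols(np.column_stack([x1, x2]), y)                # original x1
b_cen = ols(np.column_stack([x1 - x1.mean(), x2]), y)    # centered x1

# The slopes agree exactly; only the intercept moves, and it moves
# by exactly slope(x1) * mean(x1), since
# b0 + b1*x1 = (b0 + b1*mean(x1)) + b1*(x1 - mean(x1)).
print(b_raw)
print(b_cen)
```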

It would be nice to know how prediction/forecasting
is affected. Surely the condition index falls.
Is collinearity masked in some way? Are the coefficients
more efficient (in terms of variance) after subtracting
the mean?
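On the condition index, here is a quick check one can run. My
understanding is that centering mainly removes the near-collinearity
between the regressors and the intercept column (which is what drives
the condition index up when the variables have large means), while the
correlation among the centered regressors themselves is untouched. A
sketch, again with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(loc=50.0, scale=1.0, size=n)   # large mean, small spread
z = x + 0.1 * rng.normal(size=n)              # collinear with x

def condition_number(*cols):
    # ratio of largest to smallest singular value of the design matrix,
    # including the column of ones for the intercept
    A = np.column_stack([np.ones(n), *cols])
    s = np.linalg.svd(A, compute_uv=False)
    return s[0] / s[-1]

k_raw = condition_number(x, z)
k_cen = condition_number(x - x.mean(), z - z.mean())

# Centering sharply reduces the condition number here, even though
# the correlation between x and z is unchanged by centering.
print(k_raw, k_cen)
```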

What happens if we have a nonlinear regression model,
such as logistic regression?
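For what it's worth, the same invariance seems to hold there too, since
centering is just a reparametrization of the linear predictor and the
maximum likelihood estimate transforms accordingly. A small check with a
hand-rolled Newton-Raphson fit (no library fitter assumed; simulated
data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * x)))
y = (rng.uniform(size=n) < p).astype(float)

def logit_fit(x, y, iters=50):
    # Newton-Raphson for a logistic regression with an intercept
    X = np.column_stack([np.ones(len(y)), x])
    b = np.zeros(2)
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-X @ b))     # fitted probabilities
        grad = X.T @ (y - mu)                 # score vector
        H = X.T @ (X * (mu * (1.0 - mu))[:, None])  # observed information
        b = b + np.linalg.solve(H, grad)
    return b

b_raw = logit_fit(x, y)
b_cen = logit_fit(x - x.mean(), y)

# As in the linear case: slope identical, intercept absorbs the shift.
print(b_raw)
print(b_cen)
```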

Thanks for your opinion.

Victor.
--


+----------------------------------------------------------+
| victor aina  |  e-mail: [EMAIL PROTECTED] | fax: (604) 291-5944 |
+----------------------------------------------------------+


===========================================================================
This list is open to everyone.  Occasionally, less thoughtful
people send inappropriate messages.  Please DO NOT COMPLAIN TO
THE POSTMASTER about these messages because the postmaster has no
way of controlling them, and excessive complaints will result in
termination of the list.

For information about this list, including information about the
problem of inappropriate messages and information about how to
unsubscribe, please see the web page at
http://jse.stat.ncsu.edu/
===========================================================================
