I have a regression application where I fit data to polynomials of up
to 3 inputs (call them x1, x2, and x3).  The problem I'm having is
selecting terms for the regression, since there is a lot of
interdependence among the candidate terms (e.g. x1^4 looks a lot like
x1^6).  If I add one more term, the coefficients of the other terms
change greatly, and it becomes
difficult to evaluate the significance of each term and the overall
quality of the fit.
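
In case it helps, here is the symptom boiled down to one variable.
This is just a small simulated sketch in Python/numpy (my choice of
tools, and the data are made up), not my real application:

import numpy as np

# Toy one-variable version of the problem: with raw powers of x on
# [-1, 1], x^4 and x^6 are nearly collinear, so the coefficient
# estimated for x^4 shifts a lot once x^6 is added to the model.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = 2.0 * x**4 + rng.normal(scale=0.05, size=x.size)

X1 = np.column_stack([np.ones_like(x), x**4])          # intercept + x^4
X2 = np.column_stack([np.ones_like(x), x**4, x**6])    # ... plus x^6

b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
b2, *_ = np.linalg.lstsq(X2, y, rcond=None)

print("coef of x^4, without x^6:", b1[1])
print("coef of x^4, with x^6:   ", b2[1])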

With one variable, the answer is to use orthogonal polynomials.  Then
the lower-order coefficients don't change when higher-order terms are
added.  Does this concept extend to multivariate regression?  Are
there multivariate orthogonal polynomials?  My first guess would be
that there are (maybe with terms like cos^2(u), cos(u)cos(v), and
cos^2(v)?).  Does the idea that adding higher-order terms won't
affect the lower-order coefficients also extend to the multivariate
case?
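
To make that concrete, here is a sketch of the one-variable fix as I
understand it, plus my guess at the multivariate extension I'm asking
about.  Again this is Python/numpy, and the QR-based orthogonalization
is just my stand-in for "orthogonal polynomials", so please correct me
if it isn't the standard construction:

import numpy as np

# One-variable illustration of the property I want: orthogonalize the
# polynomial columns 1, x, x^2, ... against the actual sample points
# (via QR), and the lower-order coefficients stay put when
# higher-order columns are appended.
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 200)
y = 1.0 + 0.5 * x - 2.0 * x**2 + rng.normal(scale=0.1, size=x.size)

def orth_poly(x, degree):
    """Vandermonde columns 1, x, ..., x^degree, orthonormalized by QR."""
    V = np.vander(x, degree + 1, increasing=True)
    Q, _ = np.linalg.qr(V)
    return Q

Q = orth_poly(x, 6)
b_low, *_ = np.linalg.lstsq(Q[:, :5], y, rcond=None)    # up to degree 4
b_full, *_ = np.linalg.lstsq(Q, y, rcond=None)           # up to degree 6

print(b_low)
print(b_full[:5])   # matches b_low (up to rounding): the extra
                    # columns are orthogonal to the first five

# My guess for three inputs: products of 1D orthogonal bases, i.e.
# columns of the form P_i(x1) * P_j(x2) * P_k(x3).  Is that the
# standard "multivariate orthogonal polynomial" construction, and
# does the same invariance of lower-order coefficients carry over?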

Could someone please enlighten me or point me to a reference
(preferably one with practical & useful examples in addition to the
theory)?

Thank you kindly in advance for your help,
Pat