[EMAIL PROTECTED] (Donald Burrill) wrote in message news:<[EMAIL PROTECTED]>...
> On 28 Feb 2003, Patrick Noffke wrote:
> 
> > I have a regression application where I fit data to polynomials of
> > up to 3 inputs (call them x1, x2, and x3).
> 
> If I understand you correctly, you have three predictors (since you call
> them x1, x2, x3;  if they were response variables I would expect you to
> have called them y1, y2, y3).  Do you in fact have more than one
> response variable, y, say?  If what you're trying to fit is a model of
> the general form
> 
>       y = f(x1, x2, x3)
> 
> where f is a polynomial function in three variables, this is only
> multiple (univariate) regression, not multivariate regression.  But I
> may have misunderstood you.
> 

Yes, that's correct: three inputs and one response variable.
I still think of this as multivariate regression, though, since y is a
function of _multiple variables_.  I believe "multiple regression" is
short for multiple linear regression, where y is linear in the
coefficients of several terms.  Admittedly, there is no reason those
terms cannot involve multiple variables, so I guess both labels
(multiple and multivariate regression) apply.

> > The problem I'm having is selecting terms for the regression since
> > there is a lot of inter-dependence (e.g. x1^4 looks a lot like
> > x1^6).  If I add one more term, the coefficients of other terms
> > change greatly, and it becomes difficult to evaluate the
> > significance of each term and the overall quality of the fit.
> 
> Yes. This is characteristic of variables obtained by simply taking
> successive powers, especially if the original variable (x1, say) only
> took on positive values.  This is the reason textbooks on multiple
> regression (see, e.g., Draper & Smith, Applied regression analysis)
> often recommend using orthogonal polynomials.
> 
> > With one variable, the answer is to use orthogonal polynomials.
> > Then the lower order coefficients don't change by adding higher
> > order terms.  Does this concept extend to multivariate regression?
> 
> Yes; especially if by "multivariate" you really mean "multiple".
> It can be somewhat more complicated, though.
> 
> > Are there multivariate orthogonal polynomials?
> 
> You can specify orthogonal polynomials for each predictor separately.
> The complicated part arises when you want to control also for
> correlations between x1 (and its polynomials) and x2 (and its
> polynomials).  [Aside:  polynomials may be used as surrogates for other

Well, x1, x2, and x3 are highly correlated, so I would need to account
for this in forming the polynomials.
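
To make that concrete for myself, here is a rough numpy sketch (made-up
predictors and a made-up response, not my real application) of one way
to do that accounting: put the columns of the full design matrix in
order from low degree to high degree, then orthogonalize the columns in
that order with a QR decomposition.  Because the resulting columns are
mutually orthogonal, the coefficients on the lower-order
(orthogonalized) columns stay put when the higher-order columns are
added, which is the same property orthogonal polynomials give in the
1-D case.

import numpy as np

rng = np.random.default_rng(0)

# Made-up, highly correlated predictors standing in for x1, x2, x3.
n = 200
z = rng.normal(size=n)
x1 = z + 0.1 * rng.normal(size=n)
x2 = z + 0.1 * rng.normal(size=n)
x3 = z + 0.1 * rng.normal(size=n)

# Raw polynomial columns, ordered from low to high degree.
low  = [np.ones(n), x1, x2, x3]
high = [x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3]

X_small = np.column_stack(low)
X_big   = np.column_stack(low + high)

# Thin QR orthogonalizes the columns in order (like Gram-Schmidt);
# the columns of Q play the role of a multivariate orthogonal basis.
Q_small, _ = np.linalg.qr(X_small)
Q_big, _   = np.linalg.qr(X_big)

# A made-up response, just to show the coefficient behaviour.
y = 1 + 2*x1 - x2 + 0.5*x1*x3 + rng.normal(scale=0.2, size=n)

b_small, *_ = np.linalg.lstsq(Q_small, y, rcond=None)
b_big, *_   = np.linalg.lstsq(Q_big, y, rcond=None)

# The first four coefficients do not change when the higher-order
# columns are appended to the design matrix.
print(np.allclose(b_small, b_big[:4]))    # True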

> kinds of nonlinear functions -- log(x), log(y), e^x, e^y spring to mind
> -- and in such cases it is nearly always *much* better to fit the proper
> function rather than a polynomial approximation.  If you want more
> commentary on this point, ask.]
> 

Understood.  I know of no simple function that describes the
relationship between y and x1, x2, and x3.  I wish there were one...my
life would be much simpler!

> > My first guess would be there are (maybe with terms like cos^2(u),
> > cos(u)cos(v), and cos^2(v)?).
> 
> Umm... these don't look like polynomials to me;  and I do not see what
> (if any) relation you are visualizing between the original variables x1,
> x2, x3 and these variables u, v.
> 

Well, I was thinking of something like x^2, x*y, and y^2, where
x = cos(u) and y = cos(v).  I thought this was how the Chebyshev
polynomials are formed in the 1-D case.
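
For what it's worth, the 1-D connection I had in mind is the identity
T_n(cos(u)) = cos(n*u); for example, T_2(x) = 2x^2 - 1 gives
cos(2u) = 2cos^2(u) - 1.  A quick numpy sanity check of that identity:

import numpy as np
from numpy.polynomial import chebyshev as C

u = np.linspace(0.0, np.pi, 101)
# The coefficient vector [0, 0, 0, 1] selects T_3 in the Chebyshev basis.
t3_at_cos_u = C.chebval(np.cos(u), [0, 0, 0, 1])
print(np.allclose(t3_at_cos_u, np.cos(3.0 * u)))    # True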

Anyway, I have some reading to do, based on your response and the other
one from Oscar Lanzi in sci.math.  Thank you very much for your help.

Pat