Thanks again for the comments.

I think I need to reiterate that it's L1, not L2, regression that I am
interested in.

By orthogonality, Gottfried, I simply mean the following: whereas the
"usual" L1 regression minimizes, over all possible sets of coefficients
{Ci (i = 1..n), Const}, the sum (over the sample points) of

  |Y - (C1*X1 + ... + Cn*Xn + Const)|,

the orthogonal regression aims to minimize the sum of

  |Y - (C1*X1 + ... + Cn*Xn + Const)| / sqrt(C1^2 + ... + Cn^2 + 1).
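To make the objective concrete, here is a small Python sketch of it
(the function name and the (x, y) point format are my own choices for
illustration, not from any particular library):

```python
import math

def orthogonal_l1_objective(coeffs, const, points):
    """Sum of orthogonal distances from the data points to the
    hyperplane Y = C1*X1 + ... + Cn*Xn + Const.

    `coeffs` is the tuple (C1, ..., Cn); `points` is a list of
    (x, y) pairs where x is an n-tuple of predictor values and
    y is the response.
    """
    # Normalizing factor sqrt(C1^2 + ... + Cn^2 + 1) turns the
    # vertical residual into an orthogonal distance.
    norm = math.sqrt(sum(c * c for c in coeffs) + 1.0)
    total = 0.0
    for x, y in points:
        residual = y - (sum(c * xi for c, xi in zip(coeffs, x)) + const)
        total += abs(residual) / norm
    return total
```

Dropping the division by `norm` recovers the "usual" L1 objective.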

TOMS Algorithm 478 does the job for the "usual" L1 fit, but the trick
of rotating the coordinates and re-running the algorithm fails for me.
If I use stabilization of the Ci's as the stopping criterion, the
process never terminates, even on some 2-dimensional data sets (it
flips back and forth between two sets of coefficients). If instead I
continue only as long as the sum of orthogonal distances (as above)
decreases, it stops before reaching the true minimum (which I find by
"brute force", checking the hyperplanes that pass through each
(n+1)-subset of data points).
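The brute-force check can be sketched for the 2-dimensional case
(n = 1), where the candidate hyperplanes are the lines through each
pair of points (names are illustrative; vertical candidate lines are
skipped in this sketch since they cannot be written as Y = C*X + Const):

```python
from itertools import combinations

def brute_force_orthogonal_l1(points):
    """Try the line through every pair of points and return the one
    minimizing the sum of orthogonal distances to all points.

    `points` is a list of (x, y) scalar pairs.
    Returns (objective value, slope, intercept).
    """
    best = None
    for (x1, y1), (x2, y2) in combinations(points, 2):
        if x1 == x2:
            continue  # vertical line, not representable as y = c*x + const
        c = (y2 - y1) / (x2 - x1)      # slope through the pair
        const = y1 - c * x1            # intercept through the pair
        norm = (c * c + 1.0) ** 0.5
        total = sum(abs(y - (c * x + const)) / norm for x, y in points)
        if best is None or total < best[0]:
            best = (total, c, const)
    return best
```

For n predictors the same idea enumerates every (n+1)-subset of points
and fits the hyperplane through it, which is combinatorial in the
sample size and only practical for small data sets.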

For whatever it's worth, my client is planning to apply the algorithm
to time series of market prices, but, as I mentioned, I am mostly
concerned with implementing an algorithm that will do the minimization
and less with whether its use is appropriate to the application.

Vitaly
=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
.                  http://jse.stat.ncsu.edu/                    .
=================================================================
