At 6:12 PM -0800 1/17/00, ELN/fisackson wrote:
>Gentlefolk,
>
>There exists a technique known as Orthogonal Distance Regression (aka
>Deming regression) to establish whether a linear relationship exists
>between two variables when both are subject to error. A few statistics
>packages even calculate it. I wonder whether there exists a similar
>technique to calculate a correlation coefficient between two such
>variables, or whether the old Pearson correlation coefficient does not in
>fact assume that one variable is error-free and is thus a sound measure.
>If the Pearson moment is unsatisfactory and someone knows of an algorithm
>or equations from which one might calculate a suitable measure, I'd be
>grateful to hear from you.
>
>I apologize if the question is trivial, but all my reference materials
>are still in an unpacked state somewhere between Atlantis and Atlanta (;-)
>
>Thanks for your indulgence,
>
>Frank Isackson

Orthogonal Distance Regression is the technique described by Pearson
in 1901: it amounts to computing the principal component corresponding
to the smallest eigenvalue of the dispersion matrix of (x|y).
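For anyone who wants to try that computation, here is a minimal sketch in
Python/NumPy (the function name, the toy data, and the equal-error-variance
assumption are my own illustration, not something from the original posts):

import numpy as np

def odr_line(x, y):
    """Orthogonal distance (Deming, equal error variances) regression.

    Fit a*x + b*y = c by taking (a, b) as the eigenvector of the 2x2
    dispersion matrix of (x, y) belonging to the SMALLEST eigenvalue;
    the fitted line passes through the centroid (mean x, mean y).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cov = np.cov(x, y)                      # 2x2 dispersion matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    a, b = eigvecs[:, 0]                    # eigenvector, smallest eigenvalue
    slope = -a / b                          # rewrite a*x + b*y = c as y = ...
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Toy usage: noise in both variables around the line y = 2x + 1.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
x_obs = t + rng.normal(scale=0.5, size=t.size)
y_obs = 2.0 * t + 1.0 + rng.normal(scale=0.5, size=t.size)
print(odr_line(x_obs, y_obs))   # roughly (2, 1)

Note that b can be near zero when the fitted line is close to vertical; a
production version would handle that case, but it does not matter for the
sketch.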

Since you observe only two variances and one covariance, you cannot compute
the correlation when there is error in both variables: the model is
under-identified.
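To make the counting explicit, under the usual linear errors-in-variables
setup (my notation, not spelled out above) one observes x and y generated as

\[
  x = \xi + \delta, \qquad y = \beta \xi + \varepsilon,
\]

with independent errors, so the second-order moments are

\begin{align*}
  \operatorname{var}(x)    &= \sigma_\xi^2 + \sigma_\delta^2,\\
  \operatorname{var}(y)    &= \beta^2 \sigma_\xi^2 + \sigma_\varepsilon^2,\\
  \operatorname{cov}(x, y) &= \beta\,\sigma_\xi^2.
\end{align*}

Three observable moments against four unknowns (beta, sigma_xi^2,
sigma_delta^2, sigma_epsilon^2), so some extra information is needed: a
known ratio of error variances (as in Deming regression), or higher-order
moments, which is what the reference below uses.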

A nice reference, if I say so myself, is

K. van Montfort, A. Mooijaart, and J. de Leeuw (1987). Regression with
errors in variables: estimates based on third-order moments.
Statistica Neerlandica, 41, 223-237.

===
Jan de Leeuw; Professor and Chair, UCLA Department of Statistics;
US mail: 8142 Math Sciences Bldg, Box 951554, Los Angeles, CA 90095-1554
phone (310)-825-9550;  fax (310)-206-5658;  email: [EMAIL PROTECTED]
    http://www.stat.ucla.edu/~deleeuw and http://home1.gte.net/datamine/
============================================================================
          No matter where you go, there you are. --- Buckaroo Banzai
============================================================================
