On Thu, Aug 30, 2007 at 12:52:33AM -0700, Andrew Lentvorski wrote:
> Todd Walton wrote:
> 
> >5 : statistically independent
...
An oversimplification.
> 
> Wikipedia has a nice discussion about the various uses of "orthogonal".
> http://en.wikipedia.org/wiki/Orthogonality
> 
> I tend to use it in the sense of "a and b are two characteristics, 
> variables, methods, etc.  which comprise a larger space of many 
> variables, methods, etc.  If a change in a causes no change in b, they 
> are orthogonal."
> 
> I will also use the word "independent".
...
When discussing probability and statistics, the word "orthogonal"
usually means "uncorrelated".  "Independent" has a different and
much stronger meaning.  Roughly speaking, a and b are independent
if knowing one gives no information at all about the other.  By
contrast, a and b are uncorrelated (orthogonal) if the best _linear_
function (best in the least-mean-square-error sense) for predicting
one from the other is a constant, which happens exactly when their
covariance is zero.
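
In case a concrete check helps, here is a minimal numerical sketch
of that definition (assuming Python with numpy; the variable names
and the toy data are mine, not from Andrew's or Todd's posts):

    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.normal(size=200_000)
    noise = rng.normal(size=200_000)

    # The least-mean-square-error line predicting b from a has
    # slope Cov(a, b) / Var(a) and intercept E[b] - slope * E[a].
    def best_line(a, b):
        slope = np.cov(a, b)[0, 1] / np.var(a, ddof=1)
        return slope, b.mean() - slope * a.mean()

    print(best_line(a, 2.0 * a + noise))  # slope ~ 2: correlated
    print(best_line(a, noise))            # slope ~ 0: uncorrelated, so the best
                                          # linear predictor is just a constant

When the slope is (essentially) zero, the "best line" collapses to
the constant b.mean(), which is the situation the definition above
describes.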

It happens that if a and b are (jointly) Gaussian, i.e. normally
distributed, then they are independent if and only if they are
uncorrelated.  On the other hand, if a is Gaussian with mean 0 and
b=|a| or b=a*a, then a and b are uncorrelated (the covariance
E[a*b] - E[a]*E[b] vanishes by symmetry), yet b can be predicted
exactly once a is known.  b is completely dependent on a; the pair
is simply not jointly Gaussian, so the first statement does not apply.
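
A quick numerical illustration of that last example (again just a
sketch assuming Python/numpy; nothing beyond the construction
b = a*a comes from the post):

    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.normal(size=1_000_000)   # Gaussian, mean 0
    b = a * a

    print(np.corrcoef(a, b)[0, 1])   # ~0: uncorrelated, since E[a^3] = 0 by symmetry
    print(np.allclose(b, a * a))     # True by construction: knowing a pins b down
                                     # exactly, so b is completely dependent on a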

Stewart Strait


