In article <[EMAIL PROTECTED]>,
Richard A. Beldin <[EMAIL PROTECTED]> wrote:
>I have long thought that the usual textbook discussion of independence
>is misleading. In the first place, the most common situation in which we
>encounter independent random variables is a Cartesian product of
>two independent sample spaces. Example: I toss a die and a coin. I have
>reasonable assumptions about the distributions of events in each case,
>and I wish to discuss joint events. I have tried in vain to find natural
>examples of independent random variables in a sample space not
>constructed as a Cartesian product.
>I think that introducing the word "independent" as a descriptor of
>sample spaces, and then carrying it over to the events in the product
>space, would be much less likely to generate the confusion caused by the
>common informal descriptions "Independent events don't have anything to
>do with each other" and "Mutually exclusive events can't happen
>together."
>Comments?
The usual definition of "independence" is a computational
convenience, but an atrocious definition. A far better
way to do it, which conveys the essence, is to use
conditional probability. Random variables, or more
generally partitions, are independent if, given any
information about some of them, the conditional
probability of any event formed from the others is the
same as the unconditional probability. This is the way
it is used.
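This conditional-probability view can be checked exactly on the die-and-coin product space from the original post. The following is an illustrative sketch (the helper `prob` and the event encodings are my own, not from the post): conditioning on any coin outcome leaves every die event's probability unchanged.

```python
from fractions import Fraction
from itertools import product

# The 12 equally likely outcomes of tossing a die and a coin.
omega = list(product(range(1, 7), ["H", "T"]))

def prob(event, given=lambda w: True):
    """Exact conditional probability P(event | given) on omega."""
    sample = [w for w in omega if given(w)]
    return Fraction(sum(1 for w in sample if event(w)), len(sample))

# Given any information about the coin, the conditional probability of
# any die event equals its unconditional probability.
for face in range(1, 7):
    die_shows = lambda w, f=face: w[0] == f
    assert prob(die_shows) == Fraction(1, 6)
    assert prob(die_shows, given=lambda w: w[1] == "H") == Fraction(1, 6)
    assert prob(die_shows, given=lambda w: w[1] == "T") == Fraction(1, 6)
```

Using exact rational arithmetic rather than floats makes the equalities hold identically, not just approximately.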
As for a "natural" example not coming from a Cartesian
product, consider drawing a hand from an ordinary deck
of cards. On another newsgroup, someone asked for a
proof that the number of aces and the number of spades
were uncorrelated; they are not independent. The proof
I posted used the fact that, for the i-th and j-th cards
dealt, the rank of the i-th card and the suit of the
j-th are independent. For i = j, this can be viewed as a
product space, but not for distinct i and j.
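The two-card case is small enough to verify by exact enumeration. The sketch below (helper names `prob`, `aces`, `spades` are mine) checks the three claims: rank of the first card and suit of the second are independent, hence the counts of aces and spades are uncorrelated, yet the counts are not independent.

```python
from fractions import Fraction
from itertools import permutations

RANKS, SUITS = range(13), range(4)            # rank 0 = ace, suit 0 = spades
deck = [(r, s) for r in RANKS for s in SUITS]
pairs = list(permutations(deck, 2))           # all ordered 2-card deals: 52*51
N = len(pairs)

def prob(event):
    """Exact probability of `event` over all ordered 2-card deals."""
    return Fraction(sum(1 for p in pairs if event(p)), N)

# 1) The rank of the 1st card and the suit of the 2nd card are independent.
for r in RANKS:
    for s in SUITS:
        joint = prob(lambda p: p[0][0] == r and p[1][1] == s)
        assert joint == prob(lambda p: p[0][0] == r) * prob(lambda p: p[1][1] == s)

# 2) Hence, summing the pairwise-zero covariances of the card-by-card
#    indicators, Cov(#aces, #spades) = 0 ...
aces   = lambda p: sum(c[0] == 0 for c in p)
spades = lambda p: sum(c[1] == 0 for c in p)
E = lambda f: Fraction(sum(f(p) for p in pairs), N)
cov = E(lambda p: aces(p) * spades(p)) - E(aces) * E(spades)
assert cov == 0

# 3) ... but the counts are not independent: a hand of two aces that is also
#    a hand of two spades is impossible, yet each event alone has positive
#    probability, so the joint probability is not the product.
p_joint = prob(lambda p: aces(p) == 2 and spades(p) == 2)
p_prod  = prob(lambda p: aces(p) == 2) * prob(lambda p: spades(p) == 2)
assert p_joint == 0 and p_prod > 0
```

The same argument extends to a full five-card hand by linearity of covariance, since only the pairwise (i, j) behavior enters.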
There are other examples. The independence of the sample
mean and sample variance in a sample from a normal
distribution is certainly an important example. The
independence of the various sample variances in an ANOVA
model is another. The independence for each t of X(t)
and X'(t) in a stationary differentiable Gaussian
process is another.
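The mean/variance example lends itself to a quick Monte Carlo check, in the conditional-probability spirit above. This is an illustrative simulation, not a proof, and the sample sizes and tolerance are my own choices: if the sample mean and sample variance are independent, conditioning on the value of x-bar should leave the distribution of s^2 unchanged.

```python
import random
import statistics

# Draw many i.i.d. N(0, 1) samples of size 5; record each sample's mean
# and (unbiased) sample variance.
random.seed(0)
means, variances = [], []
for _ in range(20000):
    x = [random.gauss(0, 1) for _ in range(5)]
    means.append(statistics.fmean(x))
    variances.append(statistics.variance(x))

# Split on the median of x-bar: under independence, the average of s^2
# in the low-mean half and the high-mean half should agree (both near 1).
median_mean = statistics.median(means)
low  = [v for m, v in zip(means, variances) if m <= median_mean]
high = [v for m, v in zip(means, variances) if m > median_mean]
gap = abs(statistics.fmean(low) - statistics.fmean(high))
assert gap < 0.1
```

Repeating the experiment with a skewed parent distribution (say, exponential) breaks the agreement, which is one way to see that this independence is special to the normal.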
This is thrown together off the cuff. There are lots of
others.
--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054 FAX: (765)494-0558