In article <[EMAIL PROTECTED]>,
kjetil halvorsen <[EMAIL PROTECTED]> wrote:
>Slutsky's theorem says that if Xn ->(D) X and Yn ->(P) y0, y0 a
>constant, then Xn + Yn ->(D) X + y0. It is easy to make a
>counterexample if both Xn and Yn converge in distribution. Does
>anybody have a counterexample when Yn converges in probability to
>a non-constant random variable?
This is very easy. For example, let every Xn be the same
nontrivial normal random variable Z, let every Yn and y0 be -Z,
and let X be an independent copy of Z. Then Xn ->(D) X (each Xn
has the distribution of Z) and Yn ->(P) y0, yet Xn + Yn = 0 for
all n, while X + y0 = X - Z is a nondegenerate normal, not zero.
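
A quick numerical sketch in Python, if it helps (the names below
are just my labels; Z and X are taken to be standard normal):

    import numpy as np

    rng = np.random.default_rng(0)
    reps = 100_000

    Z = rng.standard_normal(reps)   # the common nontrivial normal Z
    X = rng.standard_normal(reps)   # independent copy of Z

    Xn = Z    # Xn = Z for every n, so Xn ->(D) X (same distribution)
    Yn = -Z   # Yn = -Z for every n, so Yn ->(P) y0 = -Z, non-constant

    print(np.var(Xn + Yn))  # 0.0 -- Xn + Yn is identically zero
    print(np.var(X - Z))    # ~2  -- but X + y0 = X - Z is N(0, 2)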
--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN 47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054 FAX: (765)494-0558