It is not difficult to calculate the convolution exactly - just a bit
messy. You can do it easily with a symbolic algebra package.
For the simpler case in which the two underlying normal distributions
have zero means but different variances, the density of the sum
appears to be (for x >= 0):

K.exp{-0.5 x^2 / (s_1^2 + s_2^2)}.{PHI(x1) - PHI(x2)}

where s_1, s_2 are the two standard deviations, the normalizing
constant is K = 4 / sqrt{2.pi.(s_1^2 + s_2^2)}, PHI is the standard
normal distribution function, and the arguments are:

x1 = (s_2/s_1).x / sqrt(s_1^2 + s_2^2)
x2 = -(s_1/s_2).x / sqrt(s_1^2 + s_2^2)
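You can also check the formula numerically against a direct
convolution of the two half-normal densities. A minimal Python sketch
(the function names and the trapezoidal check are my own choices;
PHI is computed from math.erf):

```python
import math

def phi_cdf(x):
    # standard normal distribution function via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def halfnormal_pdf(x, s):
    # density of a zero-mean normal with s.d. s, truncated to x >= 0
    if x < 0:
        return 0.0
    return (2.0 / (s * math.sqrt(2.0 * math.pi))) * math.exp(-0.5 * x * x / (s * s))

def sum_density(x, s1, s2):
    # the closed form quoted above, valid for x >= 0
    v = s1 * s1 + s2 * s2
    k = 4.0 / math.sqrt(2.0 * math.pi * v)
    x1 = (s2 / s1) * x / math.sqrt(v)
    x2 = -(s1 / s2) * x / math.sqrt(v)
    return k * math.exp(-0.5 * x * x / v) * (phi_cdf(x1) - phi_cdf(x2))

def numeric_convolution(z, s1, s2, n=2000):
    # trapezoidal approximation of integral_0^z f1(t).f2(z - t) dt
    h = z / n
    total = 0.5 * (halfnormal_pdf(0.0, s1) * halfnormal_pdf(z, s2)
                   + halfnormal_pdf(z, s1) * halfnormal_pdf(0.0, s2))
    for i in range(1, n):
        t = i * h
        total += halfnormal_pdf(t, s1) * halfnormal_pdf(z - t, s2)
    return total * h
```

For a few test points the closed form and the numerical convolution
agree to many decimal places, and the closed form integrates to 1
over x >= 0, which supports the value of K given above.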

In the case of equal variances (s_1 = s_2 = s), this simplifies to:

K.exp{-0.25 x^2 / s^2}.{PHI(x / (s.sqrt(2))) - PHI(-x / (s.sqrt(2)))}

where K = 2 / sqrt(pi.s^2).

The density function looks very much like that of a Rayleigh
distribution, although it is not exactly Rayleigh.
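To see how close the resemblance is, here is a rough numerical
comparison in Python. Matching the Rayleigh scale by equating means
(sigma = 4s/pi) is my own choice, not anything canonical:

```python
import math

def phi_cdf(x):
    # standard normal distribution function via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sum_density_equal(x, s):
    # the equal-variance form quoted above, valid for x >= 0
    k = 2.0 / math.sqrt(math.pi * s * s)
    u = x / (s * math.sqrt(2.0))
    return k * math.exp(-0.25 * x * x / (s * s)) * (phi_cdf(u) - phi_cdf(-u))

def rayleigh_pdf(x, sigma):
    # Rayleigh density with scale parameter sigma
    return (x / (sigma * sigma)) * math.exp(-0.5 * x * x / (sigma * sigma))
```

For s = 1 and sigma = 4/pi the two densities agree to within about
0.01 everywhere on (0, 8), so the resemblance is close, but the
difference is not zero: the sum is genuinely neither Rayleigh nor
truncated normal.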

N.B. I did the above using paper & pencil - it may be WRONG!

--
Alan Miller, Retired Scientist (Statistician)
CSIRO Mathematical & Information Sciences
Alan.Miller -at- vic.cmis.csiro.au
http://www.ozemail.com.au/~milleraj

H. J. Wang wrote in message <[EMAIL PROTECTED]>...
>Hi,
>
>Suppose X, Y are independent random variables with normal distributions.
>The means and variances are different.  Assume X1 and Y1 are random
>variables with the probability distributions f(X | X>=0) and g(Y|Y>=0),
>respectively. That is, X1 and Y1 the non-negative truncations of X and
>Y, respectively. Does anyone know whether in this case Z = X1 + Y1 is
>still a truncated normal? Any reference on this?  Thanks in advance!
