In article <82harm$1ae$[EMAIL PROTECTED]>,  <[EMAIL PROTECTED]> wrote:
>It turns out that the difference between the constrained and
>unconstrained log likelihoods has a chi-square distribution (degrees of
>freedom = number of restrictions) when it is multiplied by two.

>Where the two comes from is something I am not getting (or -2; the
>information I have isn't completely consistent, but I guess that just
>depends on which way the difference between the likelihoods is taken).

>Where is this 2 (or -2) coming from?

The first paragraph below sketches why the problem can be
reduced to a single observation from a multivariate normal
distribution with independent, unit-variance coordinates.
The calculation is then immediate.

The general case is asymptotically that of a multivariate
normal distribution with known covariance matrix, and a
linear change of coordinates shows that the particular
covariance matrix does not matter.  As the sample mean is a
sufficient statistic, we can assume sample size 1.  This
case further reduces to the null hypothesis that the entire
mean is 0.
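This reduction can be checked numerically.  The sketch below is my own
illustration, not from the post: for a N(mu, 1) sample of size n with
H0: mu = 0, the sample mean is sufficient, and sqrt(n) times it behaves
like a single standard normal observation (n = 50 and the trial count
are arbitrary choices).

```python
# Sketch (assumed setup, not from the original post): for a N(mu, 1)
# sample of size n under H0: mu = 0, the sample mean is sufficient, and
# sqrt(n) * mean(X) is one standard normal observation.
import math
import random

random.seed(0)
n = 50           # sample size (arbitrary for the illustration)
trials = 10_000

zs = []
for _ in range(trials):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    zs.append(math.sqrt(n) * sum(xs) / n)   # rescaled sample mean

# Under H0 this should have mean near 0 and variance near 1,
# i.e. it looks like a single N(0, 1) draw.
mean_z = sum(zs) / trials
var_z = sum(z * z for z in zs) / trials
print(round(mean_z, 2), round(var_z, 2))
```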

Now the logarithm of the likelihood function is

        C - .5 * \sum (X_i - \mu_i)^2.

The C is unimportant.  For the unconstrained maximum
(\mu_i = X_i), the sum is 0; for the constrained maximum
(\mu_i = 0), the sum is \sum X_i^2, which is chi-squared
with the relevant number of degrees of freedom.  But note
the factor of -.5 in front of the sum: multiplying the
difference of log likelihoods by -2 exactly cancels it,
leaving \sum X_i^2.  That is where the 2 (or -2) comes from.
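The cancellation of the -.5 can also be seen by simulation.  The sketch
below is my own check, not from the post: it draws one observation from
a k-variate standard normal under H0: mu = 0, forms -2 times the
log-likelihood difference, and compares its average over many trials to
the chi-square mean k (the values k = 3 and the trial count are
arbitrary).

```python
# Sketch (assumed setup): one observation X from a k-variate standard
# normal with true mean 0.  With log L(mu) = C - .5 * sum (X_i - mu_i)^2,
# -2 * (log L(0) - log L(mu_hat)) = sum X_i^2, a chi-square(k) variable.
import random

random.seed(1)
k = 3            # number of restrictions (free mean coordinates)
trials = 20_000

stats = []
for _ in range(trials):
    x = [random.gauss(0.0, 1.0) for _ in range(k)]
    loglik_free = 0.0                              # mu_hat_i = X_i, sum is 0
    loglik_null = -0.5 * sum(xi * xi for xi in x)  # constrained: mu_i = 0
    stats.append(-2.0 * (loglik_null - loglik_free))

# A chi-square(k) variable has mean k, so the average should be near k.
print(round(sum(stats) / trials, 2))
```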
-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558
