In article <[EMAIL PROTECTED]>,
Mobile Survey <[EMAIL PROTECTED]> wrote:
>First of all I thank everyone here for their prompt responses.
>[EMAIL PROTECTED] (Herman Rubin) wrote in message 
>news:<a604p8$[EMAIL PROTECTED]>...
>> Non-normality will cause problems with classical testing,
>> but that is questionable in any case.

>But would not using GLS help me get over this? I cannot use MLE
>because I have a fairly big sample size (more than 300) and I
>understand that MLE is extremely sensitive to sample sizes above 200.

I can see no reason why MLE should be sensitive to large
sample sizes; if the model is even fairly close to correct,
the larger the sample the better.  GLS is essentially a
version of MLE, anyhow.  However, if the model is even
slightly wrong, be prepared to find it rejected for large
samples.  This does not mean it should be discarded; the
real question is whether it is close enough that it should
be used.
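The point about large samples rejecting slightly wrong models can be
made concrete.  A minimal sketch (not from the post; the discrepancy
value F_min = 0.02 and df = 10 are made-up numbers): for ML factor
analysis the likelihood-ratio fit statistic is roughly (n - 1)*F_min,
where F_min is the minimized discrepancy.  If the model is even
slightly misspecified, F_min tends to a small positive constant, so
the statistic grows linearly in n and eventually exceeds any fixed
chi-square critical value.

```python
# Sketch: a small fixed discrepancy, scaled by sample size, crosses
# the chi-square critical value once n is large enough.
from scipy.stats import chi2

F_min = 0.02                     # hypothetical population discrepancy
df = 10                          # hypothetical degrees of freedom
crit = chi2.ppf(0.95, df)        # 5% critical value

for n in (100, 300, 1000, 5000):
    stat = (n - 1) * F_min       # approximate LR fit statistic
    print(n, round(stat, 1), "reject" if stat > crit else "retain")
```

With these made-up numbers the model is retained at n = 100 or 300
and rejected at n = 1000 or 5000, even though the misspecification
is the same throughout; which is exactly why rejection at large n
does not by itself mean the model is not close enough to use.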

>> If one uses a loading-normalized model with covariances,
>> not correlations, then the asymptotic distribution of
>> the difference between the estimated covariance matrix
>> of the factors and the actual sample matrix, and the
>> estimated specific variances and the sample values, is
>> largely independent of anything.  

>Can't I do this using AMOS? Also, could you kindly suggest a reference
>for this?

I have no idea what AMOS is.

AFAIK, this appears in an abstract of mine in the _Annals
of Mathematical Statistics_, 1955.  If you are familiar 
with the use of the "delta method" to get asymptotic 
results, it should not be difficult to produce the proof.

The idea is this, where X_i is the observation vector 
for the i-th case, F_i the factor vector, and S_i the
specific factor vector.  The model is

        X_i = L*F_i + S_i.

Then the sample covariance matrix satisfies

M_XX = L*M_FF*L' + L*M_FS + M_SF*L' + M_SS.

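This decomposition is an exact algebraic identity for the raw
second-moment matrices, which a quick numerical check confirms
(a sketch with made-up dimensions; all sizes and the use of raw
moments rather than centered covariances are my choices, not the
post's):

```python
# Verify M_XX = L*M_FF*L' + L*M_FS + M_SF*L' + M_SS numerically
# for the model X_i = L*F_i + S_i, with cases stacked as rows.
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 500, 6, 2              # cases, observed variables, factors
L = rng.normal(size=(p, k))      # loading matrix
F = rng.normal(size=(n, k))      # factor scores F_i as rows
S = rng.normal(size=(n, p))      # specific factors S_i as rows
X = F @ L.T + S                  # X_i = L*F_i + S_i

def M(a, b):                     # raw second-moment matrix, e.g. M_FS
    return a.T @ b / n

lhs = M(X, X)
rhs = L @ M(F, F) @ L.T + L @ M(F, S) + M(S, F) @ L.T + M(S, S)
assert np.allclose(lhs, rhs)     # identity holds term by term
```

The identity holds for any data generated this way; the statistical
content of the post lies in what happens to the cross terms M_FS and
the off-diagonal part of M_SS as n grows.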
If M_FS and the off-diagonal elements of M_SS were to be
exactly 0, the MLE of L, M_FF, and the specific covariance
matrix would be the actual values of L, M_FF, and M_SS.
If the usual independence assumptions are made (they can be
weakened to somewhat more than mere lack of covariance), then
sqrt(n)*M_FS and sqrt(n)*(M_SS - diag(M_SS)) are asymptotically
normal, and one can then expand the logarithm of the likelihood
function to obtain the errors as approximately jointly normal
multiples of 1/sqrt(n).
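The sqrt(n) scaling of the cross-moment can be seen in a short
simulation (an illustrative sketch, not a proof; one factor and one
specific factor, both taken as standard normal for simplicity):

```python
# Monte Carlo look at one entry of sqrt(n)*M_FS when the factor and
# the specific factor are independent: the scaled entry should be
# approximately normal with mean 0 and, for unit variances, spread
# near 1.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 400, 2000
vals = np.empty(reps)
for r in range(reps):
    f = rng.normal(size=n)               # one factor
    s = rng.normal(size=n)               # one specific factor
    vals[r] = np.sqrt(n) * np.mean(f * s)  # sqrt(n) * (M_FS entry)

print(vals.mean(), vals.std())           # near 0 and near 1
```

Without the sqrt(n) factor the entry itself shrinks like 1/sqrt(n),
which is the sense in which the estimation errors are multiples of
1/sqrt(n).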
-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558
=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
.                  http://jse.stat.ncsu.edu/                    .
=================================================================
