Re: Normality in Factor Analysis

2001-06-25 Thread Glen Barnett


Robert Ehrlich [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]...
 Calculation of eigenvalues and eigenvectors requires no assumption.
 However, evaluation of the results IMHO implicitly assumes at least a
 unimodal distribution and reasonably homogeneous variance, for the same
 reasons as ANOVA or regression.  So think of the consequences of
 calculating means and variances of a strongly bimodal distribution where
 no sample occurs near the mean and all samples are tens of standard
 deviations from the mean.

The largest number of standard deviations all data can be from the mean is 1.

To get some data further away than that, some of it has to be less than 1 s.d.
from the mean.
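
To see why (using the population form of the standard deviation; with
the n-1 sample form the constant shifts slightly): if every observation
satisfied |x_i - \bar{x}| >= c \sigma, then

  \sigma^2 = (1/n) \sum_i (x_i - \bar{x})^2
          >= (1/n) \sum_i c^2 \sigma^2 = c^2 \sigma^2,

so c <= 1.  No data set can put every observation "tens of standard
deviations" from its own mean.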

Glen





=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Normality in Factor Analysis

2001-06-22 Thread Robert Ehrlich

Calculation of eigenvalues and eigenvectors requires no assumption.
However, evaluation of the results IMHO implicitly assumes at least a
unimodal distribution and reasonably homogeneous variance, for the same
reasons as ANOVA or regression.  So think of the consequences of
calculating means and variances of a strongly bimodal distribution where
no sample occurs near the mean and all samples are tens of standard
deviations from the mean.

 Hi,

 I have a question regarding factor analysis: Is normality an important
 precondition for using factor analysis?

 If not, are there any books that justify this?



=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Maximum likelihood (Was: Re: Factor Analysis)

2001-06-18 Thread Herman Rubin

In article [EMAIL PROTECTED],
Ken Reed  [EMAIL PROTECTED] wrote:
It's not really possible to explain this in lay person's terms. The
difference between principal components analysis and common factor
analysis is roughly that PCA uses the raw scores, whereas factor
analysis uses scores predicted from the other variables and does not
include the residuals. That's as close to lay terms as I can get.

I have never heard a simple explanation of maximum likelihood estimation,
but -- MLE compares the observed covariance matrix with the covariance
matrix predicted by the model and uses that information to estimate the
factor loadings etc. that would 'fit' a multivariate normal distribution.

MLE factor analysis is commonly used in structural equation modelling, hence
Tracey Continelli's conflation of it with SEM. The conflation is not
correct, though.

I'd love to hear a simple explanation of MLE!

MLE is triviality itself, if you do not make any attempt to
state HOW it is to be carried out.

For each possible value X of the observation, and each state
of nature \theta, there is a probability (or density with 
respect to some base measure) P(X | \theta).  There is no
assumption that X is a single real number; it can be anything;
the same holds for \theta.

What MLE does is to choose the \theta which makes P(X | \theta)
as large as possible.  That is all there is to it.
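
For instance, a minimal sketch in Python (the Bernoulli model, the data,
and the scipy optimizer are purely illustrative choices):

import numpy as np
from scipy.optimize import minimize_scalar

# Observed data: 10 coin flips (1 = heads).
x = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

# log P(X | theta) for a Bernoulli model, negated for the minimizer.
def neg_log_likelihood(theta):
    return -np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

# MLE: choose the theta that makes P(X | theta) as large as possible.
result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6),
                         method='bounded')
print(result.x)  # ~0.7, i.e. just the sample proportion of heads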

-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054   FAX: (765)494-0558


=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Normality in Factor Analysis

2001-06-17 Thread Herman Rubin

In article 9gg7ht$qa3$[EMAIL PROTECTED],
haytham siala [EMAIL PROTECTED] wrote:
Hi,

I have a question regarding factor analysis: Is normality an important
precondition for using factor analysis?

If not, are there any books that justify this?

Factor analysis is quite robust against non-normality.
The essential factor structure is hardly affected by it
at all, although the representation may become somewhat
sensitive if data-dependent normalizations are used, such
as using correlations rather than covariances, or forcing
a normalization on the covariance matrix of the factors.

Some of this is in my paper with Anderson in the
Proceedings of the Third Berkeley Symposium.  The result
on the asymptotic distribution, not at all difficult to
derive, is in one of my abstracts in _Annals of
Mathematical Statistics_, 1955.  It is basically this:

Suppose the factor model is 

x = \Lambda f + s,

f the common factors and s the specific factors.  Further
suppose that f and s, and also the elements of s, are
uncorrelated, and there is adequate normalization and
smooth identification of the model by the elements of
\Lambda alone.  Now estimate \Lambda, M, the covariance
matrix of f, and S, the diagonal covariance matrix of s.
Assuming the usual assumptions for asymptotic normality of
the sample covariances of the elements of f with s, and of
the pairs of different elements of s, the deviations of the
estimate of \Lambda and of the SAMPLE values of M and S
from their actual values will have the expected asymptotic
joint normal distribution.  This makes no assumption about
the distribution of M and S about their expected values,
which is the main place where there is an effect of
normality.
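
A small simulation in the spirit of this claim (a sketch only; the
loadings, the deliberately non-normal uniform factor, and scikit-learn's
ML-based FactorAnalysis are illustrative assumptions):

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n, p, k = 5000, 6, 1

# Factor model x = \Lambda f + s with a non-normal (uniform) common
# factor f and normal specific factors s.
Lambda = np.array([[0.9, 0.8, 0.7, 0.6, 0.5, 0.4]]).T  # p x k loadings
f = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, k))  # variance 1
s = rng.normal(0.0, 0.5, size=(n, p))
x = f @ Lambda.T + s

fa = FactorAnalysis(n_components=k).fit(x)
print(fa.components_)  # recovered loadings: close to \Lambda up to sign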



-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054   FAX: (765)494-0558


=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Factor Analysis

2001-06-17 Thread Ken Reed

It's not really possible to explain this in lay person's terms. The
difference between principal components analysis and common factor
analysis is roughly that PCA uses the raw scores, whereas factor
analysis uses scores predicted from the other variables and does not
include the residuals. That's as close to lay terms as I can get.
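
One compact way to see the difference (a sketch; the simulated data and
the scikit-learn calls are my illustrative choices): PCA decomposes the
total variance, while factor analysis models only the shared covariance
and leaves each variable a separate residual ("uniqueness").

import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(1)
n = 2000
f = rng.normal(size=(n, 1))                      # one common factor
noise = rng.normal(size=(n, 4)) * [0.2, 0.4, 0.6, 0.8]
x = f @ np.array([[0.9, 0.8, 0.7, 0.6]]) + noise

pca = PCA(n_components=1).fit(x)
fa = FactorAnalysis(n_components=1).fit(x)

print(pca.components_)     # direction of maximum total variance
print(fa.components_)      # loadings on the shared factor only
print(fa.noise_variance_)  # per-variable residual ("uniqueness")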

I have never heard a simple explanation of maximum likelihood estimation,
but -- MLE compares the observed covariance matrix with the covariance
matrix predicted by the model and uses that information to estimate the
factor loadings etc. that would 'fit' a multivariate normal distribution.
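
One concrete version of that comparison is the classical likelihood-ratio
test for k factors; the sketch below (with an sklearn ML fit, and without
Bartlett's small-sample correction) is my illustration, not any
package's exact method:

import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import FactorAnalysis

def lr_test(x, k):
    """Likelihood-ratio test of a k-factor model against an
    unrestricted covariance matrix (assumes multivariate normality)."""
    n, p = x.shape
    S = np.cov(x, rowvar=False)
    fa = FactorAnalysis(n_components=k).fit(x)
    # Model-implied covariance: Sigma = Lambda Lambda' + Psi.
    Sigma = fa.components_.T @ fa.components_ + np.diag(fa.noise_variance_)
    stat = n * (np.log(np.linalg.det(Sigma)) - np.log(np.linalg.det(S))
                + np.trace(S @ np.linalg.inv(Sigma)) - p)
    df = ((p - k) ** 2 - (p + k)) // 2
    return stat, chi2.sf(stat, df)

# On data simulated from a one-factor model (like x in the sketch above),
# lr_test(x, 1) should return a large p-value: one factor fits.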

MLE factor analysis is commonly used in structural equation modelling, hence
Tracey Continelli's conflation of it with SEM. The conflation is not
correct, though.

I'd love to hear a simple explanation of MLE!



 From: [EMAIL PROTECTED] (Tracey Continelli)
 Organization: http://groups.google.com/
 Newsgroups: sci.stat.consult,sci.stat.edu,sci.stat.math
 Date: 15 Jun 2001 20:26:48 -0700
 Subject: Re: Factor Analysis
 
 Hi there,
 
 would someone please explain in lay person's terms the difference
 between principal components, common factors, and maximum likelihood
 estimation procedures for factor analyses?
 
 Should I expect my factors obtained through maximum likelihood
 estimation to be highly correlated?  Why?  When should I use a maximum
 likelihood estimation procedure, and when should I not use it?
 
 Thanks.
 
 Rita
 
 [EMAIL PROTECTED]
 
 
 Unlike the other methods, maximum likelihood allows you to estimate
 the entire structural model *simultaneously* [i.e., the effects of
 every independent variable upon every dependent variable in your
 model].  Most other methods only permit you to estimate the model in
 pieces, i.e., as a series of regressions whereby you regress every
 dependent variable upon every independent variable that has an arrow
 pointing directly to it.  Moreover, maximum likelihood actually
 provides a statistical test of significance, unlike many other methods,
 which only provide generally accepted cut-off points but not an actual
 test of statistical significance.  There are very few cases in which I
 would use anything except a maximum likelihood approach.  You can use
 it in LISREL or, if you use SPSS, you can add on the AMOS module, which
 will do this as well.
 
 
 Tracey



=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Factor Analysis

2001-06-16 Thread Alexandre Moura

Dear Haytham,

Another issue concerning the measurement of a latent construct is
unidimensionality.  Hair et al. (1998): "Unidimensionality is an assumption
underlying the calculation of reliability and is demonstrated when
indicators of a construct have acceptable fit on a single-factor
(one-dimensional) model. (...) The use of reliability measures, such as
Cronbach's alpha, does not ensure unidimensionality but instead assumes it
exists.  The researcher is encouraged to perform unidimensionality tests on
all multiple-indicator constructs before assessing their reliability."
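
For concreteness, the alpha computation itself is tiny (a plain-numpy
sketch; the score matrix is a hypothetical stand-in):

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# As the Hair et al. quote warns, a high alpha does not demonstrate
# unidimensionality; check the fit of a single-factor model first.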

This reference is very important:

Gerbing, David W., Anderson, James C. An updated paradigm for scale
development incorporating unidimensionality and its assessment.

Best regards,

Alexandre Moura.
P.S. Please accept my apologies for my English mistakes.



- Original Message -
From: haytham siala [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Friday, June 15, 2001 5:40 PM
Subject: Factor Analysis


 Hi,
 I will appreciate if someone can help me with this question: if factors
 extracted from a factor analysis were found to be reliable (using an
 internal consistency test like a Cronbach alpha), can they be used to
 represent a measure of the latent construct? If yes, are there any
 references or books that justify this technique?










=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Factor Analysis

2001-06-16 Thread Alexandre Moura

The complete reference:

Gerbing, David W., Anderson, James C. An updated paradigm for scale
development incorporating unidimensionality and its assessment. Journal of
Marketing Research. Vol. XXV (May 1988).

Alexandre Moura.

- Original Message -
From: Alexandre Moura [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Saturday, June 16, 2001 9:26 AM
Subject: Re: Factor Analysis


 Dear Haytham,

 Another issue concerning the measurement of a latent construct is
 unidimensionality.  Hair et al. (1998): "Unidimensionality is an
 assumption underlying the calculation of reliability and is demonstrated
 when indicators of a construct have acceptable fit on a single-factor
 (one-dimensional) model. (...) The use of reliability measures, such as
 Cronbach's alpha, does not ensure unidimensionality but instead assumes
 it exists.  The researcher is encouraged to perform unidimensionality
 tests on all multiple-indicator constructs before assessing their
 reliability."

 This reference is very important:

 Gerbing, David W., Anderson, James C. An updated paradigm for scale
 development incorporating unidimensionality and its assessment.

 Best regards,

 Alexandre Moura.
 P.S. Please accept my apologies for my English mistakes.



 - Original Message -
 From: haytham siala [EMAIL PROTECTED]
 To: [EMAIL PROTECTED]
 Sent: Friday, June 15, 2001 5:40 PM
 Subject: Factor Analysis


  Hi,
  I will appreciate if someone can help me with this question: if factors
  extracted from a factor analysis were found to be reliable (using an
  internal consistency test like a Cronbach alpha), can they be used to
  represent a measure of the latent construct? If yes, are there any
  references or books that justify this technique?
 
 
 
 
 
 



=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Normality in Factor Analysis

2001-06-16 Thread haytham siala

Hi,

I have a question regarding factor analysis: Is normality an important
precondition for using factor analysis?

If not, are there any books that justify this?




=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Normality in Factor Analysis

2001-06-16 Thread Eric Bohlman

In sci.stat.consult haytham siala [EMAIL PROTECTED] wrote:
 I have a question regarding factor analysis: Is normality an important
 precondition for using factor analysis?

It's necessary for testing hypotheses about factors extracted by 
Joreskog's maximum-likelihood method.  Otherwise, no.

 If not, are there any books that justify this?

Any book on factor analysis or multivariate statistics in general.



=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Factor Analysis

2001-06-15 Thread Tracey Continelli

Hi there,

would someone please explain in lay person's terms the difference
between principal components, common factors, and maximum likelihood
estimation procedures for factor analyses?

Should I expect my factors obtained through maximum likelihood
estimation to be highly correlated?  Why?  When should I use a maximum
likelihood estimation procedure, and when should I not use it?

Thanks.

Rita

[EMAIL PROTECTED]


Unlike the other methods, maximum likelihood allows you to estimate
the entire structural model *simultaneously* [i.e., the effects of
every independent variable upon every dependent variable in your
model].  Most other methods only permit you to estimate the model in
pieces, i.e., as a series of regressions whereby you regress every
dependent variable upon every independent variable that has an arrow
pointing directly to it.  Moreover, maximum likelihood actually
provides a statistical test of significance, unlike many other methods,
which only provide generally accepted cut-off points but not an actual
test of statistical significance.  There are very few cases in which I
would use anything except a maximum likelihood approach.  You can use
it in LISREL or, if you use SPSS, you can add on the AMOS module, which
will do this as well.
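
As an illustration of "simultaneous" estimation (a sketch only: the model,
the variable names, the data file, and the third-party semopy package are
my assumptions; the thread itself mentions LISREL and AMOS):

import pandas as pd
from semopy import Model  # third-party SEM package, assumed installed

# Hypothetical model: two latent factors, three indicators each, and a
# structural path from F1 to F2; all names are illustrative.
desc = """
F1 =~ x1 + x2 + x3
F2 =~ y1 + y2 + y3
F2 ~ F1
"""

data = pd.read_csv("items.csv")  # hypothetical file of item scores
model = Model(desc)
model.fit(data)         # ML estimation of the whole model at once
print(model.inspect())  # estimates with standard errors and p-values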


Tracey


=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: Factor Analysis

2001-06-15 Thread Timothy W. Victor

_Psychometric Theory_, by Jum Nunnally, to name one.

haytham siala wrote:
 
 Hi,
 I will appreciate if someone can help me with this question: if factors
 extracted from a factor analysis were found to be reliable (using an
 internal consistency test like a Cronbach alpha), can they be used to
 represent a measure of the latent construct? If yes, are there any
 references or books that justify this technique?

-- 
Timothy Victor
[EMAIL PROTECTED]
Policy Research, Evaluation, and Measurement
Graduate School of Education
University of Pennsylvania


=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



Re: factor analysis of dichotomous variables

2001-05-01 Thread John Uebersax

A list of such programs and discussion can be found at:

http://ourworld.compuserve.com/homepages/jsuebersax/binary.htm

The results of Knol & Berger (1991) and Parry & McArdle (1991)
(see above web page for citations) suggest that there is not much 
difference in results between the Muthen method and the simpler 
method of factoring tetrachoric correlations.  For additional 
information (including examples using PRELIS/LISREL and SAS) on 
factoring tetrachorics, see

http://ourworld.compuserve.com/homepages/jsuebersax/irt.htm 
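
For readers without PRELIS/LISREL or SAS, the tetrachoric step itself can
be sketched in a few lines of Python (my illustration, not code from the
pages above; it ML-fits the correlation of an underlying bivariate normal
to a 2x2 table, and the matrix of pairwise tetrachorics can then be
factored as usual):

import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import minimize_scalar

def tetrachoric(table):
    """ML tetrachoric correlation from a 2x2 count table
    [[n00, n01], [n10, n11]] for two dichotomous items."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    # Thresholds on the latent normal scales, set from the margins.
    tx = norm.ppf(table[0].sum() / n)
    ty = norm.ppf(table[:, 0].sum() / n)

    def neg_log_lik(rho):
        # P(both latents below threshold) under correlation rho.
        p00 = multivariate_normal.cdf([tx, ty], cov=[[1, rho], [rho, 1]])
        px, py = norm.cdf(tx), norm.cdf(ty)
        probs = np.array([[p00, px - p00],
                          [py - p00, 1 - px - py + p00]])
        return -np.sum(table * np.log(probs))

    return minimize_scalar(neg_log_lik, bounds=(-0.99, 0.99),
                           method='bounded').x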

Hope this helps.

John Uebersax


=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=



PCA and factor analysis: when to use which

2001-04-18 Thread Ken Reed

What is the basis for deciding when to use principal components analysis and
when to use factor analysis?  Could anyone describe a problem that
illustrates the difference?



=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=