"Rich Ulrich" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> On 29 May 2003 10:36:03 -0700, Ryan Schram <[EMAIL PROTECTED]>
> wrote:
>
> [... ]
> >
> > I am using R-1.7.0 for a logit analysis, as part of a graduate seminar
> > in quantitative methods. The rest of the class is using LIMDEP, which (I
> > think) uses the Berndt, Hall, Hall, and Hausman (BHHH) algorithm for
> > doing MLE. The glm function in S-PLUS/R uses iteratively reweighted least
> > squares (IRLS). What's the difference? Specifically,
>
> The distinction between MLE and least squares, iteratively
> reweighted or not, is something implicit in the lessons of
> "statistical estimation theory."  I don't remember that they
> taught us a whole lot, "specifically," except that ML estimators
> do have some ideal properties (consistency, asymptotic normality,
> asymptotic efficiency), achieved as N gets large.
> That might not sound GREAT, but it is a step up from
> NOT having ideal properties.
>
> Also, speaking to the overall popularity (rather than to
> a particular package): in recent years, people have figured
> out how to program the ML solutions on computers.
>
> >
> > (1) Could someone point me to articles for a more social-science
> > audience which discuss these algorithms? I'm especially interested in
> > what conditions call for one or the other, or in the strengths and
> > weaknesses of each.*
>
> So, you see, (1) is a little bit off.  There is little reason
> to be interested in the algorithms themselves, apart from what
> is known about their numerical robustness, if they are solving
> for the same model: for a logit, IRLS and BHHH both maximize the
> same binomial likelihood, so once either converges you get the
> same estimates.
>
> If anyone has usable information about robustness of
> the methods, I'd be interested, too.
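
Rich's same-model point is easy to check in R. Below is a minimal
sketch (simulated data, my own variable names): the same logit is fit
once with glm(), i.e. IRLS, and once by numerically maximizing the
binomial log-likelihood with optim(). I use BFGS as a stand-in for
LIMDEP's BHHH, which I don't have; any hill-climber that converges
should land on the same maximum.

    ## Simulated data for a single-predictor logit.
    set.seed(42)
    n <- 500
    x <- rnorm(n)
    y <- rbinom(n, size = 1, prob = plogis(-0.5 + 1.2 * x))

    ## Route 1: IRLS, via glm().
    fit.irls <- glm(y ~ x, family = binomial)

    ## Route 2: direct numerical MLE of the same binomial likelihood.
    negll <- function(b) {
      eta <- b[1] + b[2] * x
      -sum(y * eta - log(1 + exp(eta)))
    }
    fit.ml <- optim(c(0, 0), negll, method = "BFGS")

    coef(fit.irls)  # MLE via IRLS
    fit.ml$par      # same numbers, to optimizer tolerance

The algorithms differ in how fast and how reliably they get there
(step sizes, starting values, flat likelihoods), not in what they
are estimating.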
>
> I found an interesting mention in Finney's 1971 book on
> bioassay about logistic regression performed with grouped
> data.  By running weighted LS regression on the empirical
> logits, you obtain the "minimum chi-squared" solution.
>
> [snip, Q about S-plus, etc.]
>
> -- 
> Rich Ulrich, [EMAIL PROTECTED]
> http://www.pitt.edu/~wpilib/index.html
----------------------------------------------------------------------------
A. W. F. Edwards put together an interesting little book, "Likelihood"
(1972), that rather simply goes into the philosophic problems of "maximum
likelihood" versus "least squares"; in essence, the twain shall not meet.
Fisher was careful to stress that a likelihood, unlike a probability, does
not follow all the laws of probability. Nevertheless, maximum likelihood
is the method used in research where data covariance structures are built
and compared to theoretical models (as in structural equation modeling),
and it is also the basis for modern data imputation methods. But all us
simple statisticians just go ahead with least squares and are happy with
the results.
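
For what it is worth, for the ordinary linear model with normal errors
the twain do meet: the ML estimates of the coefficients are exactly the
least-squares estimates. A quick R sketch with made-up data (names are
mine):

    ## Gaussian linear model: ML coefficients equal LS coefficients.
    set.seed(7)
    x <- runif(100)
    y <- 2 + 3 * x + rnorm(100, sd = 0.5)

    fit.ls <- lm(y ~ x)

    ## Direct ML over (intercept, slope, log sigma).
    negll <- function(theta) {
      mu <- theta[1] + theta[2] * x
      -sum(dnorm(y, mean = mu, sd = exp(theta[3]), log = TRUE))
    }
    fit.ml <- optim(c(0, 0, 0), negll, method = "BFGS")

    coef(fit.ls)     # least squares
    fit.ml$par[1:2]  # the same, to optimizer tolerance

The philosophical split only bites once you leave the normal linear
model; that is where the likelihood machinery earns its keep.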

David Heiser


