On 29 May 2003 10:36:03 -0700, Ryan Schram <[EMAIL PROTECTED]> wrote:
[ ... ]
> I am using R-1.7.0 for a logit analysis, as a part of a graduate
> seminar in quantitative methods. The rest of the class is using
> LIMDEP, which (I think) uses the Berndt, Hall, Hall, and Hausman
> (BHHH) algorithm for doing MLE. The glm package in Splus/R uses
> iteratively reweighted least squares (IRLS). What's the difference?
> Specifically,

The distinction between MLE and Least Squares, Iteratively Reweighted or
not, is something implicit in the lessons of "statistical estimation
theory."  I don't remember that they taught us a whole lot,
"specifically," except that ML estimators do have some ideal properties,
asymptotically achieved as N gets large.  That might not sound GREAT, but
it is a step up from NOT having ideal properties.  Also, speaking to the
overall popularity (rather than to a particular package): in recent years,
people have figured out how to program the ML solutions.

> (1) Could someone point me to articles for a more social-science
> audience which discuss these algorithms? I'm especially interested in
> what conditions call for one or the other, or what are the strengths
> and weaknesses of each.*

So, you see, (1) is a little bit off.  There is little reason to be
interested in the algorithms, except for information about their
robustness, if they are solving for the same model; a short R sketch
appended below shows the two routes giving the same answer.  If anyone
has usable information about the robustness of the methods, I'd be
interested, too.

I did find an interesting mention in Finney's 1971 book on Bioassay,
about logistic regression performed on grouped data: by using weighted
LS regression, you obtain the "minimum chi-squared" solution.

[snip, Q about S-plus, etc.]

-- 
Rich Ulrich, [EMAIL PROTECTED]
http://www.pitt.edu/~wpilib/index.html
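Appendix: a minimal R sketch (my own illustration, not taken from LIMDEP
or from the glm sources) fitting the same logit model two ways, once with
glm()'s IRLS and once by maximizing the Bernoulli log-likelihood directly
with optim(). The simulated data and the names x and y are made up for
the example; the only point is that the two sets of estimates agree to
numerical tolerance because they solve the same likelihood.

## Simulated binary data; true intercept -0.5, true slope 1.2
set.seed(1)
n <- 500
x <- rnorm(n)
y <- rbinom(n, size = 1, prob = plogis(-0.5 + 1.2 * x))

## (1) IRLS, as done inside glm()
fit.irls <- glm(y ~ x, family = binomial(link = "logit"))

## (2) Direct ML: minimize the negative Bernoulli log-likelihood
negloglik <- function(beta) {
  eta <- beta[1] + beta[2] * x
  -sum(y * eta - log(1 + exp(eta)))
}
## hessian = TRUE so solve(fit.ml$hessian) gives the usual ML
## covariance estimate for the coefficients
fit.ml <- optim(c(0, 0), negloglik, method = "BFGS", hessian = TRUE)

## The two columns should match to several decimal places
cbind(IRLS = coef(fit.irls), ML = fit.ml$par)

For the canonical logit link, the IRLS iterations are Fisher-scoring
steps on that same log-likelihood, which is why the answers coincide;
only the route to the maximum differs.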
