[EMAIL PROTECTED] (wuzzy) wrote, in part:

> What I was trying to do (probably failed) is a bayesian logistic
> regression.
OK, thanks for the clarification; I didn't understand what the goal was
before. It is well known that the parameters of a logistic regression may
not be well determined by the available training data. The right thing to
do, from a Bayesian point of view, is to average the output of the
logistic regression model over the posterior distribution of the
parameters. It can be this simple:

  a <- (samples from posterior)
  y <- (logistic regression output for given input and parameters a)
  average(y)

If you bear this in mind, I think it will make the task comprehensible.

There is a worked example involving logistic regression in Bayesian Data
Analysis, by Gelman, Carlin, Stern, and Rubin, somewhere around page 82
(first edition).

For what it's worth,
Robert Dodier
--
Far better an approximate answer to the right question, which is often
vague, than an exact answer to the wrong question, which can always be
made precise.  -- John W. Tukey
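To make that recipe concrete, here is a minimal sketch in Python (numpy
only). The toy data set, the normal prior on the coefficients, and the
random-walk Metropolis sampler with its step size are all illustrative
assumptions, not part of the original question. The idea is just: draw
samples from the posterior of the parameters, then average the logistic
output over those samples to get the posterior predictive probability for
a new input.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy training data (assumed): one predictor plus an intercept.
    n = 100
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])          # design matrix
    true_a = np.array([-0.5, 1.5])                # "true" parameters for the toy data
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_a)))

    def log_posterior(a):
        """Bernoulli log likelihood plus a N(0, 10^2) prior on each parameter."""
        eta = X @ a
        loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
        logprior = -0.5 * np.sum(a ** 2) / 10.0 ** 2
        return loglik + logprior

    # a <- (samples from posterior), here via random-walk Metropolis.
    n_iter, step = 20000, 0.3
    a, lp = np.zeros(2), log_posterior(np.zeros(2))
    samples = []
    for t in range(n_iter):
        prop = a + step * rng.normal(size=2)
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            a, lp = prop, lp_prop
        if t >= n_iter // 2:                      # discard first half as burn-in
            samples.append(a.copy())
    samples = np.array(samples)

    # y <- (logistic regression output for given input and parameters a);
    # average(y): the posterior predictive probability for a new input.
    x_new = np.array([1.0, 0.8])                  # intercept term plus a new x value
    p = 1.0 / (1.0 + np.exp(-samples @ x_new))
    print("posterior predictive P(y=1 | x_new):", p.mean())

Note that this average of logistic outputs over posterior samples is, in
general, not the same as the logistic output evaluated at a single point
estimate of the parameters; that difference is the whole point of
averaging over the posterior.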
