On Sun, 14 Sep 2003 08:17:20 +0100 (BST) Prof Brian Ripley <[EMAIL PROTECTED]> wrote:
> On Sat, 13 Sep 2003, Trevor Hastie wrote:
>
> > Christoph Lehman had problems with separated data in two-class logistic
> > regression.
> >
> > One useful little trick is to penalize the logistic regression using a
> > quadratic penalty on the coefficients. I am sure there are functions in
> > the R contributed libraries to do this;
>
> Using nnet/multinom with weight decay does exactly this.

Also the lrm function in the Design package will do quadratic penalization.

Frank Harrell

> > otherwise it is easy to achieve via IRLS using ridge regressions. Then
> > even though the data are separated, the penalized log-likelihood has a
> > unique maximum. One intriguing feature is that as the penalty parameter
> > goes to zero, the solution converges to the SVM solution - i.e. the
> > optimal separating hyperplane; see
> > http://www-stat.stanford.edu/~hastie/Papers/margmax1.ps
>
> --
> Brian D. Ripley, [EMAIL PROTECTED]
> Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
> University of Oxford, Tel: +44 1865 272861 (self)
> 1 South Parks Road, +44 1865 272866 (PA)
> Oxford OX1 3TG, UK Fax: +44 1865 272595
>
> ______________________________________________
> [EMAIL PROTECTED] mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help

--
Frank E Harrell Jr
Professor and Chair
School of Medicine
Department of Biostatistics
Vanderbilt University
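[The "IRLS using ridge regressions" idea Hastie describes can be sketched in a few lines. The following is a minimal NumPy illustration, not the nnet/multinom or lrm code discussed above; the function name, the choice to penalize the intercept, and the toy data are all my own assumptions. Each Newton/IRLS step solves a ridge regression on the working response, so even with perfectly separated data the penalized log-likelihood has a unique finite maximum.]

```python
import numpy as np

def ridge_logistic_irls(X, y, lam=1.0, n_iter=50, tol=1e-8):
    """Ridge-penalized logistic regression via IRLS (illustrative sketch).

    Maximizes sum of log-likelihood terms minus (lam/2)*||beta||^2.
    For simplicity this penalizes every coefficient, including the
    intercept column; in practice the intercept is usually left
    unpenalized.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))               # fitted probabilities
        w = np.clip(mu * (1.0 - mu), 1e-10, None)     # IRLS weights
        z = eta + (y - mu) / w                        # working response
        XtW = X.T * w                                 # X^T W  (p x n)
        # Each step is a ridge regression of z on X with weights w:
        beta_new = np.linalg.solve(XtW @ X + lam * np.eye(p), XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Perfectly separated toy data: unpenalized ML estimates would diverge,
# but the penalized fit stays finite.
X = np.column_stack([np.ones(6), [-3.0, -2.0, -1.0, 1.0, 2.0, 3.0]])
y = np.array([0, 0, 0, 1, 1, 1])
beta_hat = ridge_logistic_irls(X, y, lam=1.0)
```

Shrinking the penalty (e.g. refitting with a smaller `lam`) lets the slope grow, consistent with the margin-maximizing limit described in the post.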
