Leave-one-out is very inaccurate for some methods, notably trees, but fine for some others (e.g. LDA) if used with a good measure of accuracy.
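A minimal sketch (not from the original exchange) of the 10-fold comparison in R, assuming a hypothetical data frame 'dat' with a factor response 'y' standing in for the ~200-row data set; lda() comes from MASS and rpart() from the rpart package:

  library(MASS)    # lda()
  library(rpart)   # rpart()

  ## 'dat' is a hypothetical data frame with factor response 'y'
  set.seed(1)
  K <- 10
  folds <- sample(rep(1:K, length.out = nrow(dat)))   # random fold labels

  err <- matrix(NA, K, 2, dimnames = list(NULL, c("lda", "rpart")))
  for (k in 1:K) {
    train <- dat[folds != k, ]
    test  <- dat[folds == k, ]
    cl.lda <- predict(lda(y ~ ., data = train), test)$class
    cl.rp  <- predict(rpart(y ~ ., data = train, method = "class"),
                      test, type = "class")
    err[k, ] <- c(mean(cl.lda != test$y), mean(cl.rp != test$y))
  }
  colMeans(err)    # 10-fold CV error-rate estimates for each method

  ## Leave-one-out predictions for LDA alone are built into lda():
  ##   lda(y ~ ., data = dat, CV = TRUE)$class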
Hint: there is a very large literature on this, so read any good book on
classification to find out what is known.

On Tue, 6 Jan 2004, Christoph Lehmann wrote:

> Hi
> what would you recommend to compare classification methods such as LDA,
> classification trees (rpart), bagging, SVM, etc:
>
> 10-fold cv (as in Ripley p. 346f)

Not a valid reference: did you mean Venables & Ripley (2000, p.346f)?
Try reading Ripley (1996), for example.

> or
>
> leaving-one-out (as e.g. implemented in LDA)?
>
> my data-set is not that huge (roughly 200 entries)

That's rather small to compare error rates on.

--
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
