Re: Reducing feature sets with cross-entropy

2001-07-12 Thread Ken Williams
[EMAIL PROTECTED] (Tom Fawcett) wrote: >Just a casual comment on this. There has been a fair amount of work on text >classification in the past few years, comparing different representations and >algorithms. I wouldn't take any individual study's conclusions as definitive, >since various papers

Re: Reducing feature sets with cross-entropy

2001-07-12 Thread Tom Fawcett
Ken Williams <[EMAIL PROTECTED]> wrote: > I had a chance last week to read Yiming Yang's paper on feature set > reduction: > > http://www.cs.cmu.edu/~yiming/papers.yy/icml97.ps.gz > > It contains the startling conclusion that the single biggest factor in > getting good results by reducing fea

Re: Reducing feature sets with cross-entropy

2001-07-11 Thread Ken Williams
I had a chance last week to read Yiming Yang's paper on feature set reduction: http://www.cs.cmu.edu/~yiming/papers.yy/icml97.ps.gz It contains the startling conclusion that the single biggest factor in getting good results by reducing feature sets is to keep frequently-used features (after ge
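To make the document-frequency idea concrete, here is a minimal Perl sketch of that kind of reduction: keep only terms that occur in at least some minimum number of documents. This is not Yang's or Ken's code; the tokenized-document structure and the $min_df cutoff are assumptions for illustration only.

use strict;
use warnings;

# Keep only terms that appear in at least $min_df distinct documents.
# $docs is an array ref of array refs of tokens (one inner array per document).
sub df_filter {
    my ($docs, $min_df) = @_;
    my %df;
    for my $doc (@$docs) {
        my %seen = map { $_ => 1 } @$doc;   # count each term at most once per document
        $df{$_}++ for keys %seen;
    }
    return grep { $df{$_} >= $min_df } keys %df;
}

my @docs = (
    [qw(perl text classification)],
    [qw(perl machine learning)],
    [qw(feature selection text)],
);
print join(' ', df_filter(\@docs, 2)), "\n";   # prints 'perl' and 'text' (in hash order)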

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread Nathan Torkington
> > ...speaking of which, is anyone familiar with Thomas M. Mitchell's book > > "Machine Learning"? It has only positive reviews on Amazon, but I'm not > > sure whether that's reliable. I have the book, and really really like it. I found it comprehensible and useful. Nat

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread lenzo
It's a very good book, and Tom is a good teacher, too. He's here at CMU. kevin On Fri, Jul 06, 2001 at 11:30:54AM -0500, Ken Williams wrote: > [EMAIL PROTECTED] (Ken Williams) wrote: > >You're right that there are a lot of resources to be found in a web > >search, but most of it is about very sp

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread Lee Jones
I was in Tom Mitchell's graduate machine learning class at CMU when he was writing the book. We were working off of 'chapters in progress', but at the time I thought what was there was great. It hits most of the major topics in machine learning and gives algorithmic outlines of things in pseudo

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread Probonas Vasilis
I have personally used this book as an introduction to machine learning theory. It is a good book, and I think the author has set up a web site with material supplementing it. Since it is a matter of personal 'taste', just take an opportunity to browse this book if your local library hol

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread Ken Williams
[EMAIL PROTECTED] (Ken Williams) wrote: >You're right that there are a lot of resources to be found in a web >search, but most of it is about very specific applications - perhaps >introductory material is best found in a textbook. ...speaking of which, is anyone familiar with Thomas M. Mitchell's

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread Ken Williams
[EMAIL PROTECTED] (John Porter) wrote: >Ken Williams wrote: >> one suggestion was to use cross-entropy >> measurements to reduce the number of features (words) considered. > >Um, have you tried a web search? Seems to me there's a fair >amount of info out there... At YAPC, it was decided that thi
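For readers wondering what a cross-entropy measurement over features looks like in practice, here is a small Perl sketch that scores a single term by the expected cross entropy between the class distribution and the class distribution conditioned on the term, CE(t) = P(t) * sum_c P(c|t) * log(P(c|t)/P(c)); high-scoring terms are kept and low-scoring ones dropped. This is only one common formulation, and the count hashes and toy numbers below are assumptions for illustration, not anything from the thread.

use strict;
use warnings;
use List::Util qw(sum);

# $docs_with_term : hash ref, class => number of docs in that class containing the term
# $docs_per_class : hash ref, class => total number of docs in that class
sub cross_entropy_score {
    my ($docs_with_term, $docs_per_class) = @_;
    my $n_total = sum(values %$docs_per_class);
    my $n_term  = sum(values %$docs_with_term) || 0;
    return 0 unless $n_term;

    my $p_term = $n_term / $n_total;
    my $score  = 0;
    for my $class (keys %$docs_per_class) {
        my $p_class         = $docs_per_class->{$class} / $n_total;
        my $p_class_given_t = ($docs_with_term->{$class} || 0) / $n_term;
        next unless $p_class_given_t;           # treat 0 * log(0) as 0
        $score += $p_class_given_t * log($p_class_given_t / $p_class);
    }
    return $p_term * $score;
}

# Toy example: a term in 8 of 10 'ml' docs but only 1 of 10 'perl' docs
# separates the classes well, so it scores high and would be retained.
my %per_class = ( ml => 10, perl => 10 );
my %with_term = ( ml => 8,  perl => 1 );
printf "score = %.4f\n", cross_entropy_score(\%with_term, \%per_class);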

Re: Reducing feature sets with cross-entropy

2001-07-06 Thread John Porter
Ken Williams wrote: > one suggestion was to use cross-entropy > measurements to reduce the number of features (words) considered. Um, have you tried a web search? Seems to me there's a fair amount of info out there... -- John Porter

RE: Reducing feature sets with cross-entropy

2001-07-06 Thread Lee Goddard
What do you mean by 'cross-entity reference'? Could you be more explicit? Lee --- Obligatory perl schmutter .sig: perl -e "print chr(rand>.5?92:47) while 1" > -Original Message- > From: Ken Williams [mailto:[EMAIL PROTECTED]] > Sent: 06 July 2001 05:29 > To: [EMAIL PROTECTED] > Subject: