Thanks Ted, I'll read through that...
On Jun 21, 2011, at 4:17 PM, Ted Dunning wrote:

> Chapter 17 in MiA has a decent description of this method.
>
> On Wed, Jun 22, 2011 at 1:17 AM, Ted Dunning <[email protected]> wrote:
>
>> You are right, that sounds crazy.
>>
>> What I did was to model the target variable (click), trying to predict
>> it with user features, item features, and user x item interaction
>> features.
>>
>> On Wed, Jun 22, 2011 at 1:10 AM, Chris Schilling <[email protected]> wrote:
>>
>>> Hey Ted,
>>>
>>> I was wondering if you could briefly describe how one would make
>>> content-based recommendations using the SGD classifiers.
>>>
>>> Say I have item1: feature1a, feature1b, feature1c
>>> and item2: feature2b, feature2c
>>>
>>> So, are you training a classifier for n labels, where n is the number
>>> of items? That seems crazy because you only have one feature vector
>>> per item.
>>>
>>> On Jun 21, 2011, at 3:49 PM, Ted Dunning wrote:
>>>
>>>> I have used the SGD classifiers for content-based recommendation. It
>>>> works out reasonably well, but the interaction variables can get kind
>>>> of expensive.
>>>>
>>>> Doing it again, I think I would use latent factor log-linear models
>>>> for the interaction features. See
>>>> http://cseweb.ucsd.edu/~akmenon/LFL-ICDM10.pdf
>>>>
>>>> We have a half-done implementation in Mahout. There was a student at
>>>> UCSD looking into completing it, but we don't have real results yet.
>>>>
>>>> On Wed, Jun 22, 2011 at 12:34 AM, Marko Ciric <[email protected]> wrote:
>>>>
>>>>> Hi guys,
>>>>>
>>>>> When trying to build a content-based recommender, there are two
>>>>> possible approaches with Apache Mahout:
>>>>>
>>>>> - implementing a custom Taste ItemSimilarity that is calculated
>>>>>   from item features, or
>>>>> - classifying a data set with Mahout by representing items as
>>>>>   vectors.
>>>>>
>>>>> Has anybody had experience comparing the performance/accuracy of
>>>>> these?
>>>>>
>>>>> Thanks
>>>>>
>>>>> --
>>>>> Marko Ćirić
>>>>> [email protected]
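
For concreteness, here is a minimal sketch (not from the thread; the class
name, encoder names, feature-space size, and learning parameters are all
illustrative assumptions) of the setup Ted describes: a single binary SGD
classifier, Mahout's OnlineLogisticRegression, trained with click/no-click
as the target on hashed user features, item features, and user x item
interaction features. Hashing the concatenated tokens is just one simple
way to get the interaction terms; it is also where the feature count blows
up, which is the expense Ted mentions.

    import org.apache.mahout.classifier.sgd.L1;
    import org.apache.mahout.classifier.sgd.OnlineLogisticRegression;
    import org.apache.mahout.math.RandomAccessSparseVector;
    import org.apache.mahout.math.Vector;
    import org.apache.mahout.vectorizer.encoders.FeatureVectorEncoder;
    import org.apache.mahout.vectorizer.encoders.StaticWordValueEncoder;

    // Illustrative sketch of a click model with hashed interaction
    // features; all names and constants here are assumptions.
    public class ClickModelSketch {
      private static final int FEATURES = 1 << 16;   // hashed feature space

      private final FeatureVectorEncoder userEnc = new StaticWordValueEncoder("user");
      private final FeatureVectorEncoder itemEnc = new StaticWordValueEncoder("item");
      private final FeatureVectorEncoder crossEnc = new StaticWordValueEncoder("cross");

      // two categories: no-click (0) and click (1)
      private final OnlineLogisticRegression model =
          new OnlineLogisticRegression(2, FEATURES, new L1())
              .learningRate(0.1)
              .lambda(1e-5);

      Vector encode(String[] userFeatures, String[] itemFeatures) {
        Vector v = new RandomAccessSparseVector(FEATURES);
        for (String u : userFeatures) {
          userEnc.addToVector(u, v);
        }
        for (String i : itemFeatures) {
          itemEnc.addToVector(i, v);
        }
        // user x item interactions, encoded by hashing the crossed token;
        // this grid of crossings is what gets expensive at scale
        for (String u : userFeatures) {
          for (String i : itemFeatures) {
            crossEnc.addToVector(u + '&' + i, v);
          }
        }
        return v;
      }

      void observe(String[] userFeatures, String[] itemFeatures, boolean clicked) {
        model.train(clicked ? 1 : 0, encode(userFeatures, itemFeatures));
      }

      double pClick(String[] userFeatures, String[] itemFeatures) {
        // probability of the "click" category for this user/item pair
        return model.classifyScalar(encode(userFeatures, itemFeatures));
      }
    }

The latent factor log-linear model from the linked paper would replace the
hand-rolled crossing above by factoring the user x item interaction rather
than materializing every crossed feature.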

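Marko's first option can be sketched the same way: a custom Taste
ItemSimilarity whose scores come from item features rather than preference
data. Assuming per-item feature vectors are available in memory, cosine is
one obvious choice (the metric, the feature map, and the naive
allSimilarItemIDs below are all assumptions, not anything from the thread):

    import java.util.Collection;
    import java.util.Map;
    import org.apache.mahout.cf.taste.common.Refreshable;
    import org.apache.mahout.cf.taste.common.TasteException;
    import org.apache.mahout.cf.taste.similarity.ItemSimilarity;
    import org.apache.mahout.math.Vector;

    // Illustrative sketch: item-item similarity computed from content
    // features instead of co-occurrence in preference data.
    public class FeatureItemSimilarity implements ItemSimilarity {
      private final Map<Long, Vector> features;   // itemID -> feature vector

      public FeatureItemSimilarity(Map<Long, Vector> features) {
        this.features = features;
      }

      @Override
      public double itemSimilarity(long itemID1, long itemID2) throws TasteException {
        Vector v1 = features.get(itemID1);
        Vector v2 = features.get(itemID2);
        if (v1 == null || v2 == null) {
          throw new TasteException("unknown item");
        }
        // cosine of the two feature vectors, already in Taste's [-1,1] range
        return v1.dot(v2) / (v1.norm(2) * v2.norm(2));
      }

      @Override
      public double[] itemSimilarities(long itemID1, long[] itemID2s) throws TasteException {
        double[] result = new double[itemID2s.length];
        for (int i = 0; i < itemID2s.length; i++) {
          result[i] = itemSimilarity(itemID1, itemID2s[i]);
        }
        return result;
      }

      @Override
      public long[] allSimilarItemIDs(long itemID) {
        // naive: treat every known item as potentially similar
        long[] ids = new long[features.size()];
        int i = 0;
        for (long id : features.keySet()) {
          ids[i++] = id;
        }
        return ids;
      }

      @Override
      public void refresh(Collection<Refreshable> alreadyRefreshed) {
        // feature vectors are static in this sketch; nothing to refresh
      }
    }

An instance of this can be handed to GenericItemBasedRecommender in place
of a similarity learned from preference data, which is what makes the first
option attractive: the rest of the Taste stack stays unchanged.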