I think that this is the link in the code: http://leon.bottou.org/research/stochastic
On Mon, Jan 13, 2014 at 11:58 AM, Frank Scholten <[email protected]> wrote:

> Do you know which paper it is? He has quite a few publications. I don't see
> any mention of one of his papers in the code. I only see
> www.eecs.tufts.edu/~dsculley/papers/combined-ranking-and-regression.pdf in
> MixedGradient, but this is something different.
>
>
> On Mon, Jan 13, 2014 at 1:27 PM, Suneel Marthi <[email protected]> wrote:
>
>> Mahout's impl is based off of Leon Bottou's paper on this subject. I
>> don't have the link handy, but it's referenced in the code, or try a
>> Google search.
>>
>> Sent from my iPhone
>>
>>> On Jan 13, 2014, at 7:14 AM, Frank Scholten <[email protected]> wrote:
>>>
>>> Hi,
>>>
>>> I followed the Coursera Machine Learning course quite a while ago and
>>> I am trying to find out how Mahout implements the logistic regression
>>> cost function in the code surrounding AbstractOnlineLogisticRegression.
>>>
>>> I am looking at the train method in AbstractOnlineLogisticRegression,
>>> and I see the online gradient descent step where the beta matrix is
>>> updated, but to me it's unclear how it matches the cost function
>>> described at:
>>> http://www.holehouse.org/mlclass/06_Logistic_Regression.html
>>>
>>> Perhaps Mahout uses an optimized approach that does not directly map
>>> onto the formula at that link?
>>>
>>> Cheers,
>>>
>>> Frank
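For reference, the connection between the cost function and the update step can be sketched in a few lines. This is NOT Mahout's actual implementation (AbstractOnlineLogisticRegression adds regularization, learning-rate annealing, and lazy updates); it is a minimal illustration of the standard stochastic gradient descent step for the logistic log-loss, in the spirit of Bottou's SGD work. The class and method names below are made up for the example:

```java
// Hedged sketch: one online gradient-descent step for logistic
// regression. Minimizing the log-loss J(beta) from the Coursera notes
// via SGD gives, for a single example (x, y) with y in {0, 1}:
//     beta <- beta + learningRate * (y - p) * x
// where p = sigmoid(beta . x). The cost function never appears
// explicitly in the code; only its gradient does.
public class SgdLogisticSketch {

    // Logistic (sigmoid) link function.
    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // One SGD step, updating beta in place.
    static void trainStep(double[] beta, double[] x, double y,
                          double learningRate) {
        // Predicted probability p = sigmoid(beta . x).
        double z = 0.0;
        for (int i = 0; i < beta.length; i++) {
            z += beta[i] * x[i];
        }
        double p = sigmoid(z);
        // (y - p) is the per-example gradient scale of the log-likelihood.
        double gradientScale = y - p;
        for (int i = 0; i < beta.length; i++) {
            beta[i] += learningRate * gradientScale * x[i];
        }
    }

    public static void main(String[] args) {
        double[] beta = {0.0, 0.0};
        double[] x = {1.0, 2.0};
        // With beta = 0, p = 0.5, so the step moves beta toward y = 1.
        trainStep(beta, x, 1.0, 0.1);
        System.out.println(beta[0] + " " + beta[1]);
    }
}
```

So the update seen in train() is the gradient of the (negative log-likelihood) cost, not the cost itself, which is why it does not visibly match the J(beta) formula on that page.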
