Mahout's implementation is based on Leon Bottou's paper on this subject. I don't
have the link handy, but it's referenced in the code, or try a Google search.
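For reference, the core of Bottou-style online (stochastic) gradient descent for logistic regression is a per-example update of the weights. This is a minimal self-contained sketch, not Mahout's actual code (the class and method names below are made up for illustration); AbstractOnlineLogisticRegression adds learning-rate annealing, regularization, and multi-class handling on top of this basic step.

```java
// Hedged sketch of the SGD update for binary logistic regression.
// For an example (x, y) with y in {0, 1}, the gradient of the log-loss
// yields the update: beta <- beta + learningRate * (y - p) * x,
// where p = sigmoid(beta . x).
public class SgdLogisticSketch {

    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    static double dot(double[] a, double[] b) {
        double s = 0.0;
        for (int i = 0; i < a.length; i++) {
            s += a[i] * b[i];
        }
        return s;
    }

    // One online gradient descent step; mutates beta in place.
    static void trainOne(double[] beta, double[] x, int y, double learningRate) {
        double p = sigmoid(dot(beta, x));
        for (int i = 0; i < beta.length; i++) {
            beta[i] += learningRate * (y - p) * x[i];
        }
    }

    public static void main(String[] args) {
        // Toy separable data: first component is a bias term,
        // label is 1 when the second feature is positive.
        double[][] xs = {{1, 2}, {1, -3}, {1, 1.5}, {1, -1}};
        int[] ys = {1, 0, 1, 0};
        double[] beta = new double[2];
        for (int epoch = 0; epoch < 200; epoch++) {
            for (int i = 0; i < xs.length; i++) {
                trainOne(beta, xs[i], ys[i], 0.1);
            }
        }
        double pPos = sigmoid(dot(beta, new double[]{1, 2}));
        double pNeg = sigmoid(dot(beta, new double[]{1, -3}));
        System.out.println(pPos > 0.9 && pNeg < 0.1);
    }
}
```

The `(y - p) * x` term is exactly the negative gradient of the log-loss cost from the Coursera notes, so the update minimizes that same cost one example at a time rather than over the full batch.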

Sent from my iPhone

> On Jan 13, 2014, at 7:14 AM, Frank Scholten <[email protected]> wrote:
> 
> Hi,
> 
> I followed the Coursera Machine Learning course quite a while ago and I am
> trying to find out how Mahout implements the Logistic Regression cost
> function in the code surrounding AbstractOnlineLogisticRegression.
> 
> I am looking at the train method in AbstractOnlineLogisticRegression and I
> see the online gradient descent step where the beta matrix is updated, but
> to me it is unclear how it matches the cost function described at:
> http://www.holehouse.org/mlclass/06_Logistic_Regression.html
> 
> Perhaps Mahout uses an optimized approach that does not directly map
> onto the formula at that link?
> 
> Cheers,
> 
> Frank
