Hi,

I followed the Coursera Machine Learning course quite a while ago and I am
trying to find out how Mahout implements the Logistic Regression cost
function in the code surrounding AbstractOnlineLogisticRegression.

I am looking at the train method in AbstractOnlineLogisticRegression and I
see the online gradient descent step where the beta matrix is updated, but
it is unclear to me how that step corresponds to the cost function
described at:
http://www.holehouse.org/mlclass/06_Logistic_Regression.html

Perhaps Mahout uses an optimized approach that does not map directly onto
the formula at that link?
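My current guess (which may well be wrong) is that the cost function never
shows up explicitly because gradient descent only needs its gradient: for
the cross-entropy cost J = -[y*log(h) + (1-y)*log(1-h)], the per-example
derivative works out to (h - y) * x_j, so an update step only needs the
prediction error. Here is a minimal sketch of that idea in my own code,
not Mahout's actual implementation:

```java
// Sketch of one online gradient-descent step for logistic regression.
// This is illustrative code, not taken from AbstractOnlineLogisticRegression.
public class LogisticSgdSketch {
    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // One SGD step on a single example (x, y); mutates beta in place.
    // The gradient of the cross-entropy cost w.r.t. beta_j is (h - y) * x_j,
    // so stepping in the negative gradient direction gives the update below.
    static void train(double[] beta, double[] x, int y, double learningRate) {
        double z = 0.0;
        for (int j = 0; j < beta.length; j++) {
            z += beta[j] * x[j];
        }
        double h = sigmoid(z); // predicted probability p(y = 1 | x)
        for (int j = 0; j < beta.length; j++) {
            beta[j] += learningRate * (y - h) * x[j];
        }
    }

    public static void main(String[] args) {
        double[] beta = new double[] {0.0, 0.0};
        double[] x = new double[] {1.0, 2.0}; // first component acts as bias
        for (int i = 0; i < 100; i++) {
            train(beta, x, 1, 0.1);
        }
        double z = beta[0] * x[0] + beta[1] * x[1];
        System.out.println(sigmoid(z)); // prediction drifts toward the label 1
    }
}
```

If that is roughly what the beta update in the train method is doing, then
the cost function from the course notes is only implicit in the code. Is
that reading correct?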

Cheers,

Frank
