Correct link. 

Simulated annealing is still used, but this isn't a particularly good 
application of it. 
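For anyone following along: the "SGD" part of the answer is just stochastic gradient descent on the logistic (log-likelihood) loss. A minimal sketch in Python, purely illustrative (the names and defaults here are mine, not Mahout's OnlineLogisticRegression API):

```python
import math
import random

def sgd_logistic(data, lr=0.1, epochs=50, seed=0):
    """Fit binary logistic regression by plain SGD on the log loss.

    data: list of (features, label) pairs with label in {0, 1}.
    Illustrative sketch only -- not Mahout's actual implementation.
    """
    rng = random.Random(seed)
    n = len(data[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        rng.shuffle(data)               # stochastic: visit examples in random order
        for x, y in data:
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            g = p - y                        # d(log loss)/dz
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# Tiny separable example: class is 1 when the single feature is positive.
train = [([1.0], 1), ([2.0], 1), ([-1.0], 0), ([-2.0], 0)]
w, b = sgd_logistic(train)
```

Note this is the same model one gets from maximum likelihood; SGD is just the optimizer used to maximize that likelihood, which is why the two descriptions below aren't really in conflict.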


> On May 22, 2014, at 13:25, Dmitriy Lyubimov <[email protected]> wrote:
> 
> I think it is actually a mix.
> 
> yes SGD, but there's also online validation of hyperparameters via
> recorded-step search. Hope my citation is correct; I started forgetting
> things. [1]
> 
> AFAIK simulated annealing approach has been abandoned in favor of [1]
> 
> [1]
> http://www.researchgate.net/publication/1916718_Recorded_Step_Directional_Mutation_for_Faster_Convergence
> 
> 
>> On Wed, May 21, 2014 at 11:44 PM, Peng Zhang <[email protected]> wrote:
>> 
>> Namit,
>> 
>> I think the theory behind Mahout’s logistic regression is stochastic
>> gradient descent, rather than maximum likelihood.
>> 
>> Best Regards,
>> Peng Zhang
>> 
>> 
>> 
>> On May 22, 2014, at 2:29 PM, namit maheshwari <[email protected]>
>> wrote:
>> 
>>> Hello Everyone,
>>> 
>>> Could anyone please let me know the algorithm used behind
>>> LogisticRegression in Mahout? Also, AdaptiveLogisticRegression mentions an
>>> *annealing* schedule.
>>> 
>>> I would be grateful if someone could guide me towards the theory behind
>>> it.
>>> 
>>> Thanks
>>> Namit
>> 
>> 
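For reference, the recorded-step mutation idea cited in [1] can be sketched roughly as follows: each accepted mutation step is recorded, and the next candidate step is a perturbation of the recorded one, so the search accelerates along directions that keep paying off. This is my own toy sketch of the idea on a simple minimization problem, not Mahout's AdaptiveLogisticRegression code:

```python
import random

def recorded_step_search(f, x0, iters=300, sigma=0.1, seed=0):
    """Minimize f with a (1+1)-style search using recorded-step mutation.

    Rough sketch of the idea behind recorded-step directional mutation:
    the last accepted step is remembered and reused as the mean of the
    next mutation. Illustrative only -- not Mahout's implementation.
    """
    rng = random.Random(seed)
    x = list(x0)
    best = f(x)
    step = [0.0] * len(x)                 # recorded last successful step
    for _ in range(iters):
        cand_step = [s + rng.gauss(0.0, sigma) for s in step]
        cand = [xi + si for xi, si in zip(x, cand_step)]
        val = f(cand)
        if val < best:                    # accept and record the step
            x, best, step = cand, val, cand_step
        else:                             # reject and decay the recorded step
            step = [0.5 * s for s in step]
    return x, best

# Toy objective: quadratic bowl centered at (3, -2).
sphere = lambda v: (v[0] - 3.0) ** 2 + (v[1] + 2.0) ** 2
x, fx = recorded_step_search(sphere, [0.0, 0.0])
```

In AdaptiveLogisticRegression the thing being searched is the hyperparameter vector (learning rate, decay, regularization), with held-out log-likelihood from online cross-validation playing the role of `f`.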
