[
https://issues.apache.org/jira/browse/SPARK-1503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14227023#comment-14227023
]
Reza Zadeh commented on SPARK-1503:
-----------------------------------
Thanks Aaron. From an implementation perspective, it's probably easier to
implement a constant step size first. From there you can check for any
finicky behavior and compare against the unaccelerated proximal gradient
method already in Spark. If it works well enough, we should commit the first
PR without backtracking and experiment with backtracking afterwards;
otherwise, if you see strange behavior, you can decide whether backtracking
would solve it.
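For reference, a minimal sketch of Nesterov's accelerated gradient method with a constant step size, applied to a toy quadratic. This is not Spark/MLlib code; the function name, the NumPy setup, and the test problem are illustrative assumptions. The constant step is taken as 1/L, where L is the Lipschitz constant of the gradient.

```python
import numpy as np

def nesterov_constant_step(grad, x0, step, iters):
    """Nesterov's accelerated gradient method with a constant step size.

    grad  : callable returning the gradient at a point
    step  : constant step size, typically 1/L for an L-Lipschitz gradient
    iters : number of iterations
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()   # extrapolated point
    t = 1.0        # momentum parameter
    for _ in range(iters):
        # Gradient step taken from the extrapolated point y, not from x.
        x_next = y - step * grad(y)
        # Standard momentum-parameter update.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolate using the momentum weight (t - 1) / t_next.
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Toy problem (illustrative): f(x) = 0.5 * x^T A x with A = diag(1, 10),
# so the gradient is A @ x, L = 10, and the minimizer is the origin.
A = np.diag([1.0, 10.0])
x_min = nesterov_constant_step(lambda x: A @ x, [1.0, 1.0],
                               step=1.0 / 10.0, iters=500)
```

A backtracking variant would replace the fixed `step` with a line search at each iteration, which is the extension discussed above.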
> Implement Nesterov's accelerated first-order method
> ---------------------------------------------------
>
> Key: SPARK-1503
> URL: https://issues.apache.org/jira/browse/SPARK-1503
> Project: Spark
> Issue Type: New Feature
> Components: MLlib
> Reporter: Xiangrui Meng
> Assignee: Aaron Staple
>
> Nesterov's accelerated first-order method is a drop-in replacement for
> steepest descent, but it converges much faster. We should implement this
> method and compare its performance with existing algorithms, including SGD
> and L-BFGS.
> TFOCS (http://cvxr.com/tfocs/) is a reference implementation of Nesterov's
> method and its variants on composite objectives.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)