[
https://issues.apache.org/jira/browse/SPARK-1673?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14339062#comment-14339062
]
Joseph K. Bradley commented on SPARK-1673:
------------------------------------------
Some thoughts:
{quote}
Friedman says in his paper that they found problems where glmnet would generate
the entire coefficient path more rapidly than sophisticated single-point
methods would generate single-point solutions.
{quote}
This is true, but it is often better still to follow an approximate path
rather than the exact path that glmnet computes. There is a lot of literature
on "continuation," "warm starts," "approximate regularization paths," and
"homotopy" (a term sometimes overloaded to mean approximate homotopy). My
concern is that glmnet spends many iterations tracing the exact path, whereas
analogous approximate methods can take larger jumps along the regularization
path.
Continuation (following an approximate regularization path) can be used as a
wrapper around many optimization algorithms to speed them up; I've used it
successfully with coordinate descent, accelerated gradient, and others, though
I haven't tried it with OWL-QN. It might be interesting to explore a general
continuation wrapper; a rough sketch is below. Some of the other benefits you
mention apply to any algorithm wrapped with continuation (e.g., automatically
choosing a starting point for the penalty parameter).
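To make that concrete, here is a minimal sketch of such a wrapper in Scala,
assuming a caller-supplied single-point solver of type
(lambda, init) => coefficients. Every name here is hypothetical rather than an
existing MLlib API:
{code:scala}
// Sketch of a continuation ("warm start") wrapper: solve a sequence of
// problems along a geometrically decreasing lambda grid, warm-starting
// each solve from the previous solution. Hypothetical names throughout.
object Continuation {
  def solvePath(
      solve: (Double, Array[Double]) => Array[Double], // single-point solver
      lambdaMax: Double, // e.g. smallest lambda giving an all-zero solution
      lambdaMin: Double, // final target penalty
      numSteps: Int,     // how coarse the approximate path is (>= 2)
      dim: Int           // number of coefficients
  ): Seq[(Double, Array[Double])] = {
    // Geometric spacing, as in the glmnet paper's lambda grid.
    val ratio = math.pow(lambdaMin / lambdaMax, 1.0 / (numSteps - 1))
    val lambdas = Seq.iterate(lambdaMax, numSteps)(_ * ratio)
    var init = Array.fill(dim)(0.0) // at lambdaMax the solution is near zero
    lambdas.map { lambda =>
      val coef = solve(lambda, init) // warm start from the previous solution
      init = coef
      (lambda, coef)
    }
  }
}
{code}
Because each intermediate solve only needs to be accurate enough to warm-start
the next one, the intermediate tolerances can be loosened; that is where most
of the speedup over an exact path comes from.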
> GLMNET implementation in Spark
> ------------------------------
>
> Key: SPARK-1673
> URL: https://issues.apache.org/jira/browse/SPARK-1673
> Project: Spark
> Issue Type: New Feature
> Components: MLlib
> Reporter: Sung Chung
>
> This is a Spark implementation of GLMNET by Jerome Friedman, Trevor Hastie,
> and Rob Tibshirani.
> http://www.jstatsoft.org/v33/i01/paper
> It's a straightforward implementation of coordinate-descent-based
> L1/L2-regularized linear models, including linear, logistic, and multinomial
> regression.
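For context, the core of the paper's method is a coordinate-wise
soft-thresholding update for the elastic net. A minimal single-machine sketch
for squared loss with standardized features (illustrative only, not the
proposed Spark implementation):
{code:scala}
// Soft-thresholding operator S(z, gamma) = sign(z) * max(|z| - gamma, 0).
def softThreshold(z: Double, gamma: Double): Double =
  math.signum(z) * math.max(math.abs(z) - gamma, 0.0)

// Naive coordinate descent for the elastic net (squared loss,
// columns of x standardized to unit variance).
def coordinateDescent(
    x: Array[Array[Double]], // n rows, p standardized feature columns
    y: Array[Double],
    lambda: Double,
    alpha: Double, // 1.0 = lasso, 0.0 = ridge
    iters: Int
): Array[Double] = {
  val n = x.length
  val p = x(0).length
  val beta = Array.fill(p)(0.0)
  val resid = y.clone() // residuals r_i = y_i - x_i . beta
  for (_ <- 0 until iters; j <- 0 until p) {
    // Correlation of feature j with the partial residual (beta_j added back).
    val zj = (0 until n).map(i => x(i)(j) * (resid(i) + x(i)(j) * beta(j))).sum / n
    val newBj = softThreshold(zj, lambda * alpha) / (1.0 + lambda * (1.0 - alpha))
    // Update the residuals for the change in beta_j.
    val delta = newBj - beta(j)
    if (delta != 0.0) for (i <- 0 until n) resid(i) -= x(i)(j) * delta
    beta(j) = newBj
  }
  beta
}
{code}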