[
https://issues.apache.org/jira/browse/SPARK-17136?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15429039#comment-15429039
]
DB Tsai commented on SPARK-17136:
---------------------------------
Typically, a first-order optimizer takes a function that returns the value of
the objective function and its first derivative (the gradient). A second-order
optimizer additionally takes the Hessian matrix. Since second-order methods
don't scale with the number of features, we can focus on first-order optimizers
first. We also need an interface for handling non-differentiable terms such as
L1 regularization outside the loss, since handling them is specific to the
design of the algorithm and so cannot be part of the loss function. We may look
at how other packages in R, MATLAB, or Python define their interfaces and come
up with a generic one. The default implementation can wrap the Breeze one, and
users can supply their own implementation to replace the default optimizer.
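As a rough sketch of the kind of interface described above (a minimal Python sketch; the names `FirstOrderOptimizer`, `GradientDescent`, and `minimize` are hypothetical illustrations, not Spark or Breeze API), a first-order optimizer can be defined over a single function that returns both the objective value and the gradient:

```python
import numpy as np

def objective_with_grad(w):
    """Example objective f(w) = ||w - 1||^2; returns (value, gradient)."""
    diff = w - 1.0
    return float(diff @ diff), 2.0 * diff

class FirstOrderOptimizer:
    """Hypothetical interface: given a (value, gradient) function and a
    starting point, return the minimizing weight vector."""
    def minimize(self, fun, w0):
        raise NotImplementedError

class GradientDescent(FirstOrderOptimizer):
    """Simple default implementation; in Spark this role could be filled by
    a wrapper around a Breeze optimizer instead."""
    def __init__(self, step=0.1, max_iter=200, tol=1e-8):
        self.step, self.max_iter, self.tol = step, max_iter, tol

    def minimize(self, fun, w0):
        w = np.asarray(w0, dtype=float)
        for _ in range(self.max_iter):
            value, grad = fun(w)          # one call yields value and gradient
            if np.linalg.norm(grad) < self.tol:
                break
            w = w - self.step * grad      # plain gradient step
        return w

# A user could swap in any FirstOrderOptimizer subclass here.
w_opt = GradientDescent().minimize(objective_with_grad, np.zeros(3))
```

Bundling the value and gradient in one callback is the convention used by Breeze's `DiffFunction` and SciPy's `jac=True` mode, and it lets the algorithm avoid recomputing shared work between the two.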
> Design optimizer interface for ML algorithms
> --------------------------------------------
>
> Key: SPARK-17136
> URL: https://issues.apache.org/jira/browse/SPARK-17136
> Project: Spark
> Issue Type: Sub-task
> Components: ML
> Reporter: Seth Hendrickson
>
> We should consider designing an interface that allows users to use their own
> optimizers in some of the ML algorithms, similar to MLlib.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]