[ https://issues.apache.org/jira/browse/FLINK-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15286469#comment-15286469 ]

ASF GitHub Bot commented on FLINK-1979:
---------------------------------------

Github user tillrohrmann commented on a diff in the pull request:

    https://github.com/apache/flink/pull/1985#discussion_r63509731
  
    --- Diff: flink-libraries/flink-ml/src/main/scala/org/apache/flink/ml/optimization/LossFunction.scala ---
    @@ -23,8 +23,8 @@ import org.apache.flink.ml.math.BLAS
     
     /** Abstract class that implements some of the functionality for common loss functions
       *
    -  * A loss function determines the loss term $L(w) of the objective function  $f(w) = L(w) +
    -  * \lambda R(w)$ for prediction tasks, the other being regularization, $R(w)$.
    +  * A loss function determines the loss term `L(w)` of the objective function  `f(w) = L(w) +
    +  * lambda*R(w)` for prediction tasks, the other being regularization, `R(w)`.
    --- End diff --
    
    Good catch :-)


> Implement Loss Functions
> ------------------------
>
>                 Key: FLINK-1979
>                 URL: https://issues.apache.org/jira/browse/FLINK-1979
>             Project: Flink
>          Issue Type: Improvement
>          Components: Machine Learning Library
>            Reporter: Johannes Günther
>            Assignee: Johannes Günther
>            Priority: Minor
>              Labels: ML
>
> For convex optimization problems, optimizer methods like SGD rely on a 
> pluggable implementation of a loss function and its first derivative.
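The pluggable design the issue describes can be sketched as follows. This is an illustrative sketch only, not Flink ML's actual `LossFunction` API; the names `SquaredLoss` and `sgdStep` are invented for the example. The idea is that an SGD-style optimizer only needs the loss value and its first derivative, so any loss implementing the trait can be plugged in:

```scala
// Hypothetical sketch of a pluggable loss function for SGD-style optimizers.
trait LossFunction {
  /** Loss L for a single example, given a prediction and the true label. */
  def loss(prediction: Double, label: Double): Double
  /** First derivative of the loss with respect to the prediction. */
  def lossDerivative(prediction: Double, label: Double): Double
}

/** Squared loss: L = 0.5 * (prediction - label)^2, derivative = prediction - label. */
object SquaredLoss extends LossFunction {
  override def loss(prediction: Double, label: Double): Double = {
    val residual = prediction - label
    0.5 * residual * residual
  }
  override def lossDerivative(prediction: Double, label: Double): Double =
    prediction - label
}

/** One gradient-descent update on a 1-D weight; any LossFunction plugs in. */
def sgdStep(w: Double, x: Double, y: Double, lf: LossFunction, stepSize: Double): Double = {
  val prediction = w * x
  // Chain rule: dL/dw = (dL/dprediction) * (dprediction/dw) = lossDerivative * x
  w - stepSize * lf.lossDerivative(prediction, y) * x
}
```

Because the optimizer touches only `loss` and `lossDerivative`, swapping in, say, a logistic loss requires no change to the update step, which is the convexity-friendly pluggability the issue asks for.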



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)