Github user WeichenXu123 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18992#discussion_r134109929
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/optim/loss/DifferentiableRegularization.scala ---
    @@ -57,6 +61,11 @@ private[ml] class L2Regularization(
               val coef = coefficients(j)
               applyFeaturesStd match {
                 case Some(getStd) =>
    +              // If `standardization` is false, we still standardize the data
    +              // to improve the rate of convergence; as a result, we have to
    +              // perform this reverse standardization by penalizing each component
    +              // differently to get effectively the same objective function when
    +              // the training dataset is not standardized.
    --- End diff --
    
    Yes, this is an important comment that we need to keep.
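    
    To illustrate why the comment matters, here is a minimal NumPy sketch (not Spark code; the arrays and numbers are illustrative) of the equivalence it describes: training on internally standardized features with a per-component penalty scaled by `1 / std_j^2` yields the same objective as a uniform L2 penalty on the original, unstandardized problem.
    
    ```python
    import numpy as np
    
    rng = np.random.default_rng(0)
    # Unscaled features with very different per-column scales.
    X = rng.normal(size=(100, 3)) * np.array([1.0, 5.0, 0.2])
    std = X.std(axis=0)
    Xs = X / std  # standardized copy used internally to speed up convergence
    
    lam = 0.1
    w_orig = np.array([0.5, -1.2, 2.0])  # coefficients in the original space
    w_std = w_orig * std                 # equivalent coefficients for Xs
    
    # Predictions agree, so the data-fit term of the objective is unchanged.
    assert np.allclose(X @ w_orig, Xs @ w_std)
    
    # A uniform L2 penalty on the original coefficients ...
    penalty_orig = 0.5 * lam * np.sum(w_orig ** 2)
    # ... equals a per-component penalty on the standardized coefficients,
    # scaled by 1 / std_j^2 -- the "reverse standardization" in the comment.
    penalty_std = 0.5 * lam * np.sum((w_std / std) ** 2)
    assert np.allclose(penalty_orig, penalty_std)
    ```
    
    This is why, in the `Some(getStd)` branch above, each coefficient's penalty contribution is divided by the corresponding feature's variance rather than penalized uniformly.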

