Github user yanboliang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17862#discussion_r115278918
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/classification/LinearSVC.scala ---
    @@ -205,15 +233,21 @@ class LinearSVC @Since("2.2.0") (
           val costFun = new LinearSVCCostFun(instances, $(fitIntercept),
             $(standardization), bcFeaturesStd, regParamL2, $(aggregationDepth))
     
    -      def regParamL1Fun = (index: Int) => 0D
    -      val optimizer = new BreezeOWLQN[Int, BDV[Double]]($(maxIter), 10, regParamL1Fun, $(tol))
    +      val optimizerAlgo = $(optimizer) match {
    --- End diff --
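
For context, the hunk above is truncated at the `match` on `$(optimizer)`. A minimal sketch of what such a Breeze optimizer switch could look like (illustrative only, not the PR's actual code; it assumes the param accepts just "lbfgs" and "owlqn", and uses a plain `optimizerName` string in place of `$(optimizer)`):

```scala
import breeze.linalg.{DenseVector => BDV}
import breeze.optimize.{LBFGS => BreezeLBFGS, OWLQN => BreezeOWLQN}

val maxIter = 100           // stands in for $(maxIter)
val tol = 1e-6              // stands in for $(tol)
val optimizerName = "lbfgs" // stands in for $(optimizer)

val optimizerAlgo = optimizerName match {
  case "lbfgs" =>
    // Plain L-BFGS assumes a differentiable objective (e.g. squared hinge loss).
    new BreezeLBFGS[BDV[Double]](maxIter, 10, tol)
  case "owlqn" =>
    // OWL-QN handles a non-smooth L1 term; with a zero per-coordinate L1 weight
    // this mirrors the removed `regParamL1Fun = (index: Int) => 0D` above.
    new BreezeOWLQN[Int, BDV[Double]](maxIter, 10, (_: Int) => 0.0, tol)
}
```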
    
    I saw that other libraries such as ```sklearn.svm.LinearSVC``` use squared hinge loss as the default loss function; it is differentiable and can be solved by LBFGS. What about merging #17645 first, and then we can use LBFGS as the default solver as well? Otherwise, making this switch after the 2.2 release would be a behavior change. cc @jkbradley
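
To spell out the differentiability point: the hinge loss max(0, 1 - y * w.x) has a kink at the margin, while the squared hinge loss max(0, 1 - y * w.x)^2 has a continuous gradient -2 * max(0, 1 - y * w.x) * y * x, so L-BFGS can be applied directly. A minimal single-instance sketch with Breeze (illustrative names, not Spark's aggregator code; labels assumed to be in {-1, +1}):

```scala
import breeze.linalg.{DenseVector => BDV}

// Squared hinge loss for one instance: max(0, 1 - y * w.x)^2
def squaredHingeLoss(w: BDV[Double], x: BDV[Double], y: Double): Double = {
  val slack = math.max(0.0, 1.0 - y * (w dot x))
  slack * slack
}

// Its gradient: -2 * max(0, 1 - y * w.x) * y * x.
// At the hinge point (slack == 0) the gradient is exactly zero, so it is
// continuous everywhere, unlike the plain hinge loss, whose subgradient
// jumps at the margin.
def squaredHingeGradient(w: BDV[Double], x: BDV[Double], y: Double): BDV[Double] = {
  val slack = math.max(0.0, 1.0 - y * (w dot x))
  x * (-2.0 * slack * y)
}
```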

