Github user debasish83 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17862#discussion_r115659479
  
    --- Diff: mllib/src/test/scala/org/apache/spark/ml/classification/LinearSVCSuite.scala ---
    @@ -154,22 +159,23 @@ class LinearSVCSuite extends SparkFunSuite with MLlibTestSparkContext with Defau
     
       test("linearSVC with sample weights") {
         def modelEquals(m1: LinearSVCModel, m2: LinearSVCModel): Unit = {
    -      assert(m1.coefficients ~== m2.coefficients absTol 0.05)
    +      assert(m1.coefficients ~== m2.coefficients absTol 0.07)
           assert(m1.intercept ~== m2.intercept absTol 0.05)
         }
    -
    -    val estimator = new LinearSVC().setRegParam(0.01).setTol(0.01)
    -    val dataset = smallBinaryDataset
    -    MLTestingUtils.testArbitrarilyScaledWeights[LinearSVCModel, LinearSVC](
    -      dataset.as[LabeledPoint], estimator, modelEquals)
    -    MLTestingUtils.testOutliersWithSmallWeights[LinearSVCModel, LinearSVC](
    -      dataset.as[LabeledPoint], estimator, 2, modelEquals, outlierRatio = 3)
    -    MLTestingUtils.testOversamplingVsWeighting[LinearSVCModel, LinearSVC](
    -      dataset.as[LabeledPoint], estimator, modelEquals, 42L)
    +    LinearSVC.supportedOptimizers.foreach { opt =>
    +      val estimator = new LinearSVC().setRegParam(0.02).setTol(0.01).setSolver(opt)
    +      val dataset = smallBinaryDataset
    +      MLTestingUtils.testArbitrarilyScaledWeights[LinearSVCModel, LinearSVC](
    +        dataset.as[LabeledPoint], estimator, modelEquals)
    +      MLTestingUtils.testOutliersWithSmallWeights[LinearSVCModel, LinearSVC](
    +        dataset.as[LabeledPoint], estimator, 2, modelEquals, outlierRatio = 3)
    +      MLTestingUtils.testOversamplingVsWeighting[LinearSVCModel, LinearSVC](
    +        dataset.as[LabeledPoint], estimator, modelEquals, 42L)
    +    }
       }
     
    -  test("linearSVC comparison with R e1071 and scikit-learn") {
    -    val trainer1 = new LinearSVC()
    +  test("linearSVC OWLQN comparison with R e1071 and scikit-learn") {
    +    val trainer1 = new LinearSVC().setSolver(LinearSVC.OWLQN)
           .setRegParam(0.00002) // set regParam = 2.0 / datasize / c
    --- End diff ---
    
    These slides also explain it (please see slide 32): the max can be replaced by a soft-max whose softness parameter lambda can be tuned. Log-sum-exp is a standard soft-max, similar to a smoothed ReLU, and we can reuse it from MLP:
    ftp://ftp.cs.wisc.edu/math-prog/talks/informs99ssv.ps
    ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/99-03.pdf
    I can add the formulation if there is interest. The soft-max parameter needs some tuning, but convergence will be good with LBFGS (OWLQN is not needed).
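
    For concreteness, here is a minimal sketch of the log-sum-exp smoothing of the hinge loss max(0, 1 - margin). This is not Spark's implementation; the object name, method names, and the softness parameter lambda are illustrative. Larger lambda gives a tighter approximation but a steeper gradient, and since the smoothed loss is differentiable, plain LBFGS can minimize it directly:

    ```scala
    // Sketch only: log-sum-exp (softplus) smoothing of the hinge loss.
    // smoothHinge(margin) = (1 / lambda) * log(1 + exp(lambda * (1 - margin)))
    // approximates max(0, 1 - margin); lambda is a tuning parameter.
    object SmoothHinge {

      /** Smoothed hinge loss, computed in a numerically stable way. */
      def loss(margin: Double, lambda: Double): Double = {
        val t = lambda * (1.0 - margin)
        if (t > 0) (t + math.log1p(math.exp(-t))) / lambda
        else math.log1p(math.exp(t)) / lambda
      }

      /** Gradient of the smoothed loss w.r.t. the margin: -sigmoid(lambda * (1 - margin)). */
      def gradientWrtMargin(margin: Double, lambda: Double): Double = {
        val t = lambda * (1.0 - margin)
        -1.0 / (1.0 + math.exp(-t))
      }
    }
    ```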


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to