Github user jkbradley commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6504#discussion_r34312376
  
    --- Diff: docs/ml-guide.md ---
    @@ -157,6 +174,118 @@ There are now several algorithms in the Pipelines API which are not in the lower
     * [Feature Extraction, Transformation, and Selection](ml-features.html)
     * [Ensembles](ml-ensembles.html)
     
    +## Linear Methods with Elastic Net Regularization
    +
    +In MLlib, we implement popular linear methods such as logistic regression and linear least squares with L1 or L2 regularization. Refer to [the linear methods section](mllib-linear-methods.html) for details. In `spark.ml`, we also provide the Pipelines API for [elastic net](http://en.wikipedia.org/wiki/Elastic_net_regularization), a hybrid of L1 and L2 regularization proposed in [this paper](http://users.stat.umn.edu/~zouxx019/Papers/elasticnet.pdf). Mathematically, it is defined as a convex combination of the L1 norm and the squared L2 norm:
    +`\[
    +\alpha \|\wv\|_1 + (1-\alpha) \frac{1}{2}\|\wv\|_2^2, \quad \alpha \in [0, 1].
    +\]`
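    +Substituting the endpoint values of $\alpha$ makes the special cases explicit:
    +`\[
    +\alpha = 1 \Rightarrow \|\wv\|_1 \text{ (L1)}, \qquad \alpha = 0 \Rightarrow \frac{1}{2}\|\wv\|_2^2 \text{ (L2)}.
    +\]`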
    +Setting $\alpha$ appropriately thus recovers both L1 and L2 regularization as special cases. For example, if a [linear regression](/api/scala/index.html#org.apache.spark.ml.regression.LinearRegression) model is trained with the elastic net parameter $\alpha$ set to $1$, it is equivalent to a [Lasso](http://en.wikipedia.org/wiki/Least_squares#Lasso_method) model. On the other hand, if $\alpha$ is set to $0$, the trained model reduces to a [ridge regression](http://en.wikipedia.org/wiki/Tikhonov_regularization) model. We provide the Pipelines API for both linear regression and logistic regression with elastic net regularization.
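    +
    +For example, the two endpoints can be configured directly on an estimator (a minimal sketch, assuming the `spark.ml` `LinearRegression` estimator linked above; the parameter values are illustrative):
    +
    +{% highlight scala %}
    +import org.apache.spark.ml.regression.LinearRegression
    +
    +// elasticNetParam = 1.0: pure L1 penalty (Lasso-like)
    +val lasso = new LinearRegression().setRegParam(0.3).setElasticNetParam(1.0)
    +
    +// elasticNetParam = 0.0: pure L2 penalty (ridge-like)
    +val ridge = new LinearRegression().setRegParam(0.3).setElasticNetParam(0.0)
    +{% endhighlight %}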
    +
    +**Examples**
    +
    +<div class="codetabs">
    +
    +<div data-lang="scala" markdown="1">
    +The following example loads a sample dataset and fits a logistic regression model with elastic net regularization.
    +
    +{% highlight scala %}
    +
    +import org.apache.spark.ml.classification.LogisticRegression
    +import org.apache.spark.mllib.util.MLUtils
    +
    +// Load training data as a DataFrame
    +val training = MLUtils.loadLibSVMFile(sc, "data/mllib/sample_libsvm_data.txt").toDF()
    +
    +val lr = new LogisticRegression()
    +  .setMaxIter(10)          // maximum number of iterations
    +  .setRegParam(0.3)        // regularization parameter
    +  .setElasticNetParam(0.8) // elastic net mixing parameter (alpha above)
    +  .setTol(1e-6)            // convergence tolerance
    +
    +// Fit the model
    +val lrModel = lr.fit(training)
    +
    +// Print the weights and intercept for logistic regression
    +println(s"Weights: ${lrModel.weights} Intercept: ${lrModel.intercept}")
    +
    +{% endhighlight %}
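    +
    +The fitted `lrModel` is a `Transformer`; as a quick sanity check (a minimal sketch, reusing the `training` DataFrame from above), it can score a dataset via `transform`:
    +
    +{% highlight scala %}
    +// Append prediction columns and inspect a few rows
    +lrModel.transform(training)
    +  .select("label", "probability", "prediction")
    +  .show(5)
    +{% endhighlight %}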
    +
    +</div>
    +
    +<div data-lang="java" markdown="1">
    +The following example loads a sample dataset and fits a logistic regression model with elastic net regularization.
    +
    +{% highlight java %}
    +
    +import org.apache.spark.ml.classification.LogisticRegression;
    +import org.apache.spark.ml.classification.LogisticRegressionModel;
    +import org.apache.spark.mllib.regression.LabeledPoint;
    +import org.apache.spark.mllib.util.MLUtils;
    +import org.apache.spark.SparkConf;
    +import org.apache.spark.SparkContext;
    +import org.apache.spark.sql.DataFrame;
    +import org.apache.spark.sql.SQLContext;
    +
    +public class LogisticRegressionWithElasticNetExample {
    +  public static void main(String[] args) {
    +    SparkConf conf = new SparkConf()
    +      .setAppName("Logistic Regression with Elastic Net Example");
    +
    +    SparkContext sc = new SparkContext(conf);
    +    SQLContext sql = new SQLContext(sc);
    +    String path = "data/mllib/sample_libsvm_data.txt";
    +
    +    // Load training data
    +    DataFrame training = sql.createDataFrame(MLUtils.loadLibSVMFile(sc, path).toJavaRDD(), LabeledPoint.class);
    +
    +    LogisticRegression lr = new LogisticRegression()
    +      .setMaxIter(10)          // maximum number of iterations
    +      .setRegParam(0.3)        // regularization parameter
    +      .setElasticNetParam(0.8) // elastic net mixing parameter (alpha above)
    +      .setThreshold(0.6)       // decision threshold for the positive class
    +      .setProbabilityCol("myProbability"); // custom probability column name
    +
    +    // Fit the model
    +    LogisticRegressionModel lrModel = lr.fit(training);
    +
    +    // Print the weights and intercept for logistic regression
    +    System.out.println("Weights: " + lrModel.weights() + " Intercept: " + lrModel.intercept());
    +  }
    +}
    +{% endhighlight %}
    +</div>
    +
    +<div data-lang="python" markdown="1">
    +The following example loads a sample dataset and fits a logistic regression model with elastic net regularization.
    +
    +{% highlight python %}
    +
    +from pyspark.ml.classification import LogisticRegression
    +from pyspark.mllib.util import MLUtils
    +
    +# Load training data
    +training = MLUtils.loadLibSVMFile(sc, "data/mllib/sample_libsvm_data.txt").toDF()
    +
    +lr = LogisticRegression(maxIter=10, regParam=0.3)
    --- End diff ---
    
    Let's be Pythonic and set the elasticNetParam via an argument, like the 
other params.

