Github user feynmanliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/8518#discussion_r38265176
--- Diff: docs/ml-guide.md ---
@@ -186,26 +186,18 @@ This example covers the concepts of `Estimator`, `Transformer`, and `Param`.
<div data-lang="scala">
{% highlight scala %}
-import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.mllib.linalg.{Vector, Vectors}
-import org.apache.spark.mllib.regression.LabeledPoint
-import org.apache.spark.sql.{Row, SQLContext}
-
-val conf = new SparkConf().setAppName("SimpleParamsExample")
-val sc = new SparkContext(conf)
-val sqlContext = new SQLContext(sc)
-import sqlContext.implicits._
+import org.apache.spark.sql.Row
// Prepare training data.
-// We use LabeledPoint, which is a case class. Spark SQL can convert RDDs of case classes
-// into DataFrames, where it uses the case class metadata to infer the schema.
-val training = sc.parallelize(Seq(
- LabeledPoint(1.0, Vectors.dense(0.0, 1.1, 0.1)),
- LabeledPoint(0.0, Vectors.dense(2.0, 1.0, -1.0)),
- LabeledPoint(0.0, Vectors.dense(2.0, 1.3, 1.0)),
- LabeledPoint(1.0, Vectors.dense(0.0, 1.2, -0.5))))
+val training = sqlContext.createDataFrame(Seq(
+ (1.0, Vectors.dense(0.0, 1.1, 0.1)),
+ (0.0, Vectors.dense(2.0, 1.0, -1.0)),
+ (0.0, Vectors.dense(2.0, 1.3, 1.0)),
+ (1.0, Vectors.dense(0.0, 1.2, -0.5))
+)).toDF("label", "features")
--- End diff ---
+1 to switching from reflection-based schema inference to explicitly naming columns; should we consider deprecating `LabeledPoint`, since a big use case for it is schema inference?
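
For context, here is a rough sketch (not part of the PR) contrasting the two approaches, assuming a `sqlContext` is already in scope as the surrounding guide sets up:

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

// Reflection-based: Spark SQL infers the schema ("label", "features")
// from the fields of the LabeledPoint case class.
val inferred = sqlContext.createDataFrame(Seq(
  LabeledPoint(1.0, Vectors.dense(0.0, 1.1, 0.1)),
  LabeledPoint(0.0, Vectors.dense(2.0, 1.0, -1.0))
))

// Explicit: plain tuples, with column names given directly to toDF.
val explicit = sqlContext.createDataFrame(Seq(
  (1.0, Vectors.dense(0.0, 1.1, 0.1)),
  (0.0, Vectors.dense(2.0, 1.0, -1.0))
)).toDF("label", "features")
```

The explicit form keeps the column names visible in the example itself, which lines up with the defaults ML estimators such as `LogisticRegression` expect ("label" and "features").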