GitHub user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13592#discussion_r67543099
  
    --- Diff: docs/sql-programming-guide.md ---
    @@ -517,24 +517,26 @@ types such as Sequences or Arrays. This RDD can be implicitly converted to a DataFrame
     registered as a table. Tables can be used in subsequent SQL statements.
     
     {% highlight scala %}
    -// sc is an existing SparkContext.
    -val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    +val spark: SparkSession // An existing SparkSession
     // this is used to implicitly convert an RDD to a DataFrame.
    -import sqlContext.implicits._
    +import spark.implicits._
     
     // Define the schema using a case class.
     // Note: Case classes in Scala 2.10 can support only up to 22 fields. To work around this limit,
     // you can use custom classes that implement the Product interface.
     case class Person(name: String, age: Int)
     
    -// Create an RDD of Person objects and register it as a table.
    -val people = sc.textFile("examples/src/main/resources/people.txt").map(_.split(",")).map(p => Person(p(0), p(1).trim.toInt)).toDF()
    +// Create an RDD of Person objects and register it as a temporary view.
    +val people = sc
    +  .textFile("examples/src/main/resources/people.txt")
    +  .map(_.split(","))
    +  .map(p => Person(p(0), p(1).trim.toInt))
    +  .toDF()
    --- End diff --
    
    I think it's still fair to say that we are using reflection, since all the serializer/deserializer expressions used in encoders are generated via reflection.
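
    To make this concrete, here is a minimal sketch (my illustration, not part of the diff) using the public `Encoders.product` API, which derives an encoder for a case class through Scala reflection; `Person` mirrors the case class in the snippet above:

    ```scala
    import org.apache.spark.sql.{Encoder, Encoders}

    case class Person(name: String, age: Int)

    // Encoders.product inspects the case class through Scala reflection
    // (via its TypeTag) and builds the serializer/deserializer expressions
    // that Datasets/DataFrames use under the hood.
    val personEncoder: Encoder[Person] = Encoders.product[Person]

    // Prints the StructType inferred reflectively from the case class fields.
    println(personEncoder.schema)
    ```

    The same reflective derivation happens implicitly when `.toDF()` resolves an `Encoder[Person]` through `import spark.implicits._`.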

