GitHub user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16856#discussion_r104255651
  
    --- Diff: docs/quick-start.md ---
    @@ -10,12 +10,13 @@ description: Quick start tutorial for Spark SPARK_VERSION_SHORT
     This tutorial provides a quick introduction to using Spark. We will first introduce the API through Spark's
     interactive shell (in Python or Scala),
     then show how to write applications in Java, Scala, and Python.
    -See the [programming guide](programming-guide.html) for a more complete reference.
     
     To follow along with this guide, first download a packaged release of Spark from the
     [Spark website](http://spark.apache.org/downloads.html). Since we won't be using HDFS,
     you can download a package for any version of Hadoop.
     
    +Note that, before Spark 2.0, the main programming interface of Spark was Resilient Distributed Dataset (RDD). After Spark 2.0, RDD is replaced by Dataset, which is strongly typed like RDD, but with richer optimizations under the hood. The RDD interface is still supported, and you can get a more complete reference at [RDD programming guide](rdd-programming-guide.html). However, we highly recommend you to switch to use Dataset, which has better performance than RDD. See [SQL programming guide](sql-programming-guide.html) to get more information about Dataset.
    --- End diff ---
    
    Not a big deal, but it should probably say things like "_the_ Resilient Distributed Dataset", "RDDs are replaced" vs "RDD is replaced", and "like an RDD" vs "like RDD"? Also "the SQL programming guide" and "the RDD programming guide".

