Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2346#discussion_r17510558
  
    --- Diff: core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala ---
    @@ -40,7 +41,9 @@ import org.apache.spark.rdd.{EmptyRDD, HadoopRDD, NewHadoopRDD, RDD}
      * A Java-friendly version of [[org.apache.spark.SparkContext]] that returns
      * [[org.apache.spark.api.java.JavaRDD]]s and works with Java collections instead of Scala ones.
      */
    -class JavaSparkContext(val sc: SparkContext) extends JavaSparkContextVarargsWorkaround {
    +class JavaSparkContext(val sc: SparkContext)
    +    extends JavaSparkContextVarargsWorkaround with Closeable {
    --- End diff ---
    
    Note the two-space indent here, but don't worry, I will fix it.
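
    For context, the diff above makes JavaSparkContext implement java.io.Closeable,
    so callers can manage it with Java 7 try-with-resources. A minimal usage sketch,
    assuming close() simply delegates to stopping the underlying SparkContext as the
    PR intends (the class and app names below are just for illustration):

        import java.util.Arrays;
        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;

        public class CloseableDemo {
          public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                .setAppName("closeable-demo")
                .setMaster("local[*]");
            // try-with-resources calls close() automatically, even if the body
            // throws, so the SparkContext is always shut down cleanly.
            try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
              long count = jsc.parallelize(Arrays.asList(1, 2, 3)).count();
              System.out.println("count = " + count);
            }
          }
        }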

