Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9845#discussion_r45404702
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/SqlNewHadoopRDD.scala ---
    @@ -48,26 +49,26 @@ private[spark] class SqlNewHadoopPartition(
     }
     
     /**
    - * An RDD that provides core functionality for reading data stored in Hadoop (e.g., files in HDFS,
    - * sources in HBase, or S3), using the new MapReduce API (`org.apache.hadoop.mapreduce`).
    - * It is based on [[org.apache.spark.rdd.NewHadoopRDD]]. It has three additions.
    - * 1. A shared broadcast Hadoop Configuration.
    - * 2. An optional closure `initDriverSideJobFuncOpt` that set configurations at the driver side
    - *    to the shared Hadoop Configuration.
    - * 3. An optional closure `initLocalJobFuncOpt` that set configurations at both the driver side
    - *    and the executor side to the shared Hadoop Configuration.
    - *
    - * Note: This is RDD is basically a cloned version of [[org.apache.spark.rdd.NewHadoopRDD]] with
    - * changes based on [[org.apache.spark.rdd.HadoopRDD]].
    - */
    +  * An RDD that provides core functionality for reading data stored in Hadoop (e.g., files in HDFS,
    --- End diff --
    
    Two things:
    
    1. Use IntelliJ's simple paste (right-click, then "Simple Paste") so that IntelliJ doesn't auto-format the pasted code.
    2. Update your IntelliJ settings so it uses the Javadoc comment format rather than the ScalaDoc one: in Preferences -> Editor -> Code Style -> Scala -> Spaces tab -> Other, uncheck "Use formatting for ScalaDoc2 options".
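    
    For reference, the two styles differ only in how the continuation lines of a doc comment are indented. This is a minimal illustration (not code from the PR itself); the diff above shows exactly this one-space shift:
    
    ```scala
    // Javadoc style (what the Spark codebase uses): the `*` on each
    // continuation line aligns under the first `*` of the opening `/**`.
    /**
     * An example doc comment in Javadoc style.
     */
    
    // ScalaDoc2 style (what IntelliJ produced in this diff): continuation
    // lines get one extra space so the `*` aligns under the second `*`.
    /**
      * The same comment after IntelliJ's ScalaDoc2 reformatting.
      */
    ```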
    



