Github user Ngone51 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21036#discussion_r180650706
  
    --- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
    @@ -86,8 +88,7 @@ private[spark] class HadoopPartition(rddId: Int, override val index: Int, s: Inp
      * @param keyClass Class of the key associated with the inputFormatClass.
      * @param valueClass Class of the value associated with the inputFormatClass.
      * @param minPartitions Minimum number of HadoopRDD partitions (Hadoop Splits) to generate.
    - *
    - * @note Instantiating this class directly is not recommended, please use
    +  * @note Instantiating this class directly is not recommended, please use
    --- End diff ---
    
    ditto.
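
    For context, the hunk above only changes the indentation of the scaladoc `@note` line from ` *` to `  *` (and drops the blank scaladoc line above it). A minimal sketch of the two alignment conventions, assuming this "ditto" refers to the same asterisk-alignment comment made earlier in the review; the class names are placeholders:

        /**
         * JavaDoc-style continuation: one leading space, so each `*`
         * lines up under the first `*` of the opening `/**`.
         *
         * @note example tag
         */
        class JavaDocStyleExample

        /**
          * ScalaDoc-style continuation: two leading spaces, so each `*`
          * lines up under the second `*` of the opening `/**`.
          *
          * @note example tag
          */
        class ScalaDocStyleExample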

