Github user gzm0 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3262#discussion_r20702584
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -1427,47 +1427,74 @@ object SparkContext extends Logging {
     
       private[spark] val DRIVER_IDENTIFIER = "<driver>"
     
    -  implicit object DoubleAccumulatorParam extends AccumulatorParam[Double] {
     +  // The following deprecated objects have already been copied to `object AccumulatorParam` to
     +  // make the compiler find them automatically. They are duplicated here only for backward
     +  // compatibility; please update `object AccumulatorParam` accordingly if you plan to modify the
     +  // following ones.
    +
     +  @deprecated("Replaced by implicit objects in AccumulatorParam. This is kept here only for " +
     +    "backward compatibility.", "1.2.0")
    +  object DoubleAccumulatorParam extends AccumulatorParam[Double] {
         def addInPlace(t1: Double, t2: Double): Double = t1 + t2
         def zero(initialValue: Double) = 0.0
       }
     
    -  implicit object IntAccumulatorParam extends AccumulatorParam[Int] {
     +  @deprecated("Replaced by implicit objects in AccumulatorParam. This is kept here only for " +
     +    "backward compatibility.", "1.2.0")
    +  object IntAccumulatorParam extends AccumulatorParam[Int] {
         def addInPlace(t1: Int, t2: Int): Int = t1 + t2
         def zero(initialValue: Int) = 0
       }
     
    -  implicit object LongAccumulatorParam extends AccumulatorParam[Long] {
     +  @deprecated("Replaced by implicit objects in AccumulatorParam. This is kept here only for " +
     +    "backward compatibility.", "1.2.0")
    +  object LongAccumulatorParam extends AccumulatorParam[Long] {
         def addInPlace(t1: Long, t2: Long) = t1 + t2
         def zero(initialValue: Long) = 0L
       }
     
    -  implicit object FloatAccumulatorParam extends AccumulatorParam[Float] {
     +  @deprecated("Replaced by implicit objects in AccumulatorParam. This is kept here only for " +
     +    "backward compatibility.", "1.2.0")
    +  object FloatAccumulatorParam extends AccumulatorParam[Float] {
         def addInPlace(t1: Float, t2: Float) = t1 + t2
         def zero(initialValue: Float) = 0f
       }
     
    -  // TODO: Add AccumulatorParams for other types, e.g. lists and strings
     +  // The following deprecated functions have already been moved to `object RDD` to
     +  // make the compiler find them automatically. They are still kept here for backward compatibility
     +  // and just call the corresponding functions in `object RDD`.
     
    -  implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
     +  @deprecated("Replaced by implicit functions in org.apache.spark.rdd package object. This is " +
    --- End diff --
    
    All these comments are outdated (they still refer to the package object, but should refer to the `RDD` companion).

