GitHub user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/7403#discussion_r38458893
  
    --- Diff: core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala ---
    @@ -565,12 +603,25 @@ class PairRDDFunctions[K, V](self: RDD[(K, V)])
       }
     
       /**
    -   * Simplified version of combineByKey that hash-partitions the resulting RDD using the
    -   * existing partitioner/parallelism level.
    +   * This method is here for backward compatibility. It
    +   * does not provide combiner classtag information to
    +   * the shuffle.
    +   *
    +   * @see [[combineByKeyWithClassTag]]
        */
       def combineByKey[C](createCombiner: V => C, mergeValue: (C, V) => C, mergeCombiners: (C, C) => C)
    -    : RDD[(K, C)] = self.withScope {
    -    combineByKey(createCombiner, mergeValue, mergeCombiners, defaultPartitioner(self))
    +    : RDD[(K, C)] = {
    +    combineByKeyWithClassTag(createCombiner, mergeValue, mergeCombiners)(null)
    +  }
    +
    +  /**
    +   * Simplified version of combineByKeyWithClassTag that hash-partitions the resulting RDD using the
    +   * existing partitioner/parallelism level.
    +   */
    +  def combineByKeyWithClassTag[C](createCombiner: V => C, mergeValue: (C, V) => C,
    +                                  mergeCombiners: (C, C) => C)
    +    (implicit ct: ClassTag[C]): RDD[(K, C)] = self.withScope {
    --- End diff --
    
    For these internal `*WithClassTag` methods we should not have `self.withScope` here. All that does is record the RDD operation and display it to the user in the UI, which we don't want.
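    
    To make that concrete, here is a rough sketch of the shape I mean (not the final code; the 4-argument `combineByKeyWithClassTag` overload and its body are assumed from the old `combineByKey`): keep `self.withScope` only on the user-facing overload, and let the `*WithClassTag` variant do the work without wrapping it again.
    
    ```scala
    // Rough sketch only: a fragment of PairRDDFunctions, not the committed change.
    // The 4-argument combineByKeyWithClassTag overload used below is assumed to
    // exist elsewhere in this class.
    def combineByKey[C](createCombiner: V => C, mergeValue: (C, V) => C,
        mergeCombiners: (C, C) => C): RDD[(K, C)] = self.withScope {
      // User-facing overload: withScope records the operation so it shows up in the UI.
      combineByKeyWithClassTag(createCombiner, mergeValue, mergeCombiners)(null)
    }
    
    def combineByKeyWithClassTag[C](createCombiner: V => C, mergeValue: (C, V) => C,
        mergeCombiners: (C, C) => C)(implicit ct: ClassTag[C]): RDD[(K, C)] = {
      // Internal *WithClassTag variant: no second self.withScope, so the same user
      // call is not recorded as an extra scope in the UI.
      combineByKeyWithClassTag(createCombiner, mergeValue, mergeCombiners,
        defaultPartitioner(self))
    }
    ```
    
    That way the UI still shows a single `combineByKey` node for the user's call, and the `*WithClassTag` plumbing stays invisible.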

