Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22957#discussion_r238650207
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/exchange/EnsureRequirements.scala ---
    @@ -145,9 +145,14 @@ case class EnsureRequirements(conf: SQLConf) extends Rule[SparkPlan] {
         assert(requiredChildDistributions.length == children.length)
         assert(requiredChildOrderings.length == children.length)
     
    +    val aliasMap = AttributeMap[Expression](children.flatMap(_.expressions.collect {
    +      case a: Alias => (a.toAttribute, a)
    +    }))
    +
         // Ensure that the operator's children satisfy their output distribution requirements.
         children = children.zip(requiredChildDistributions).map {
    -      case (child, distribution) if child.outputPartitioning.satisfies(distribution) =>
    +      case (child, distribution) if child.outputPartitioning.satisfies(
    +          distribution.mapExpressions(replaceAlias(_, aliasMap))) =>
    --- End diff --
    
    seems we are not on the same page...
    
    Let's make the example clearer. Assuming a `relation[a, b]` whose partitioning is `hash(a, b)`, then `Project(a as c, a as d, b, relation)`'s partitioning should be `[hash(c, b), hash(d, b)]`. It's like a flatMap.
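    To illustrate the flatMap-like expansion, here is a minimal standalone sketch (not Spark's actual `Partitioning` classes; attributes are modeled as plain strings, and `expandPartitioning` is a hypothetical helper). Each hash key is replaced by every output name it survives under, and the cross product of those choices yields one candidate partitioning per combination:

    ```scala
    object AliasPartitioningSketch {
      // An output column name standing in for an Attribute (simplified model).
      type Attr = String

      // aliases: input attribute -> all output names it is visible under
      // (an un-aliased column maps to itself).
      def expandPartitioning(
          hashKeys: Seq[Attr],
          aliases: Map[Attr, Seq[Attr]]): Seq[Seq[Attr]] = {
        // For each key, branch over every name it survives under; the fold
        // builds the cross product, one candidate partitioning per combination.
        hashKeys.foldLeft(Seq(Seq.empty[Attr])) { (acc, key) =>
          for {
            partial <- acc
            name <- aliases.getOrElse(key, Seq.empty)
          } yield partial :+ name
        }
      }

      def main(args: Array[String]): Unit = {
        // relation[a, b] is hash-partitioned on (a, b);
        // Project(a as c, a as d, b) duplicates `a` under two names.
        val expanded = expandPartitioning(
          hashKeys = Seq("a", "b"),
          aliases  = Map("a" -> Seq("c", "d"), "b" -> Seq("b")))
        // Matches the example above: [hash(c, b), hash(d, b)]
        assert(expanded == Seq(Seq("c", "b"), Seq("d", "b")))
        println(expanded)
      }
    }
    ```

    With two names for `a`, two candidate partitionings come out; a key aliased n times would multiply the candidates by n, which is why a single rewritten distribution is not enough.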


