cloud-fan commented on a change in pull request #26656: [SPARK-27986][SQL] Support ANSI SQL filter clause for aggregate expression
URL: https://github.com/apache/spark/pull/26656#discussion_r360296384
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/RewriteDistinctAggregates.scala
 ##########
 @@ -194,7 +238,11 @@ object RewriteDistinctAggregates extends Rule[LogicalPlan] {
       val regularAggOperatorMap = regularAggExprs.map { e =>
         // Perform the actual aggregation in the initial aggregate.
         val af = patchAggregateFunctionChildren(e.aggregateFunction)(regularAggChildAttrLookup.get)
-        val operator = Alias(e.copy(aggregateFunction = af), e.sql)()
+        val filterOpt = e.filter.map { fe =>
+          val newChildren = fe.children.map(c => regularAggChildAttrLookup.getOrElse(c, c))
+          fe.withNewChildren(newChildren)

 Review comment:
   ah I got it after reading `expressionAttributePair`
   
   We change the attributes in the `Expand` output, so we need to replace the attributes in the filter conditions with the new ones.
   
   But the code here is wrong: `.children` only returns the direct children, which are not always attributes. For `a + b > 1`, for example, the children are `a + b` and `1`, so attributes nested deeper in the tree are never replaced. We should do
   ```
   e.filter.map(_.transform {
     case a: Attribute => regularAggChildAttrLookup.getOrElse(a, a)
   })
   ```
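   To illustrate the difference, here is a minimal, self-contained sketch. These are toy case classes, not the real Catalyst `Expression` API, and the `lookup` map is a hypothetical stand-in for `regularAggChildAttrLookup` — a one-level map over `children` misses the nested attribute, while a recursive transform replaces it at any depth:
   ```scala
   object ExprDemo {
     sealed trait Expr { def children: Seq[Expr] }
     case class Attribute(name: String) extends Expr { val children = Nil }
     case class Literal(value: Int) extends Expr { val children = Nil }
     case class Add(left: Expr, right: Expr) extends Expr {
       val children = Seq(left, right)
     }
     case class GreaterThan(left: Expr, right: Expr) extends Expr {
       val children = Seq(left, right)
     }
   
     // Bottom-up recursive rewrite, analogous in spirit to Catalyst's
     // Expression.transform: rewrite the subtrees first, then apply the
     // rule to the current node if it matches.
     def transform(e: Expr)(rule: PartialFunction[Expr, Expr]): Expr = {
       val withNewChildren = e match {
         case Add(l, r)         => Add(transform(l)(rule), transform(r)(rule))
         case GreaterThan(l, r) => GreaterThan(transform(l)(rule), transform(r)(rule))
         case leaf              => leaf
       }
       rule.applyOrElse(withNewChildren, identity[Expr])
     }
   
     // Hypothetical attribute map, standing in for regularAggChildAttrLookup.
     val lookup: Map[Expr, Expr] = Map(Attribute("a") -> Attribute("a#1"))
   
     // Filter condition `a + b > 1`: the attribute `a` sits below an Add node.
     val filter: Expr = GreaterThan(Add(Attribute("a"), Attribute("b")), Literal(1))
   
     // One-level mapping over `children` sees only `a + b` and `1`,
     // neither of which is in the map, so nothing is replaced.
     def shallowRewrite: Seq[Expr] = filter.children.map(c => lookup.getOrElse(c, c))
   
     // The recursive transform reaches `a` at any depth and replaces it.
     def deepRewrite: Expr = transform(filter) {
       case a: Attribute => lookup.getOrElse(a, a)
     }
   }
   ```
   The real `Expression.transform` also handles arbitrary node arity and preserves unchanged subtrees, but the failure mode is the same one shown here: the shallow rewrite leaves the filter condition untouched.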
