HyukjinKwon commented on code in PR #44460:
URL: https://github.com/apache/spark/pull/44460#discussion_r1436181963
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala:
##########
@@ -1706,14 +1706,18 @@ object PushPredicateThroughNonJoin extends Rule[LogicalPlan] with PredicateHelpe
   // implies that, for a given input row, the output are determined by the expression's initial
   // state and all the input rows processed before. In another word, the order of input rows
   // matters for non-deterministic expressions, while pushing down predicates changes the order.
-  // This also applies to Aggregate.
   case Filter(condition, project @ Project(fields, grandChild))
     if fields.forall(_.deterministic) && canPushThroughCondition(grandChild, condition) =>
     val aliasMap = getAliasMap(project)
     project.copy(child = Filter(replaceAlias(condition, aliasMap), grandChild))
+  // Push [[Filter]] operators through [[Aggregate]] operators. Parts of the predicates that can
Review Comment:
Let's use backquotes here instead of `[[...]]`. `[[...]]` is Scaladoc syntax, so we don't need it in a plain code comment.
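For context, the `Project` case quoted in the diff can be sketched with a toy, self-contained model. This is a hypothetical simplification for illustration only: the `Plan`, `Project`, `Filter`, and `Relation` classes below are invented stand-ins, not Spark's Catalyst classes, and conditions are modeled as plain strings rather than expression trees.

```scala
// Toy model of pushing a Filter below a Project (hypothetical simplified AST,
// not Spark's actual Catalyst classes).
sealed trait Plan
case class Relation(name: String) extends Plan
// aliases maps output name -> underlying column, e.g. "x" -> "a"
case class Project(aliases: Map[String, String], child: Plan) extends Plan
case class Filter(condition: String, child: Plan) extends Plan

// Rewrite aliased names in the condition to the underlying columns,
// loosely mirroring replaceAlias(condition, getAliasMap(project)).
// (Naive string replacement; fine for this toy example only.)
def replaceAlias(condition: String, aliasMap: Map[String, String]): String =
  aliasMap.foldLeft(condition) { case (c, (alias, col)) => c.replace(alias, col) }

def pushDown(plan: Plan): Plan = plan match {
  // Swap Filter and Project, substituting aliases in the condition so it
  // still refers to valid columns below the Project.
  case Filter(cond, p @ Project(aliases, grandChild)) =>
    p.copy(child = Filter(replaceAlias(cond, aliases), grandChild))
  case other => other
}

val plan   = Filter("x > 1", Project(Map("x" -> "a"), Relation("t")))
val pushed = pushDown(plan)
// pushed == Project(Map("x" -> "a"), Filter("a > 1", Relation("t")))
```

The key point the real rule also relies on: the filter condition must be rewritten through the alias map before it is moved below the projection, otherwise it would reference names that only exist above the `Project`.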
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]