cloud-fan commented on code in PR #49202:
URL: https://github.com/apache/spark/pull/49202#discussion_r1905202971
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala:
##########
@@ -1811,7 +1811,7 @@ object PushPredicateThroughNonJoin extends Rule[LogicalPlan] with PredicateHelpe
     case Filter(condition, project @ Project(fields, grandChild))
       if fields.forall(_.deterministic) && canPushThroughCondition(grandChild, condition) =>
       val aliasMap = getAliasMap(project)
-      project.copy(child = Filter(replaceAlias(condition, aliasMap), grandChild))
+      project.copy(child = Filter(rewriteCondition(condition, aliasMap), grandChild))
Review Comment:
This also means that when we push down predicates through other operators,
those operators should propagate the attributes. Looking at `def canPushThrough`,
I think most of them are fine (they propagate all output attributes from the
child), but `Expand` needs an update: we need to append the attributes that will
be generated by `With` to `Expand#projections` and `output`.
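
For context, the rewrite in the diff above follows the usual alias-substitution pattern: a filter condition that refers to aliases produced by a `Project` must have those aliases replaced by the underlying expressions before it can be pushed below the `Project`. The following is a minimal standalone sketch of that pattern using toy plan/expression types, not Spark's actual Catalyst classes (all names here are illustrative assumptions, and `rewriteCondition`'s extra `With`-handling from this PR is not modeled):

```scala
object PushdownSketch {
  // Toy expression tree (assumption: stands in for Catalyst's Expression).
  sealed trait Expr
  case class Attr(name: String) extends Expr
  case class Alias(child: Expr, name: String) extends Expr
  case class GreaterThan(left: Expr, right: Expr) extends Expr
  case class Lit(value: Int) extends Expr

  // Toy logical plan (assumption: stands in for Catalyst's LogicalPlan).
  sealed trait Plan
  case class Relation(output: Seq[String]) extends Plan
  case class Project(fields: Seq[Expr], child: Plan) extends Plan
  case class Filter(condition: Expr, child: Plan) extends Plan

  // Map each alias name to the expression it aliases.
  def getAliasMap(p: Project): Map[String, Expr] =
    p.fields.collect { case Alias(e, n) => n -> e }.toMap

  // Replace references to aliases with the underlying expressions so the
  // condition remains resolvable below the Project.
  def replaceAlias(cond: Expr, aliasMap: Map[String, Expr]): Expr = cond match {
    case Attr(n) if aliasMap.contains(n) => aliasMap(n)
    case GreaterThan(l, r) =>
      GreaterThan(replaceAlias(l, aliasMap), replaceAlias(r, aliasMap))
    case other => other
  }

  // Push a Filter below a Project by substituting aliases in its condition.
  def pushDown(plan: Plan): Plan = plan match {
    case Filter(cond, p @ Project(fields, grandChild)) =>
      Project(fields, Filter(replaceAlias(cond, getAliasMap(p)), grandChild))
    case other => other
  }

  def main(args: Array[String]): Unit = {
    // Filter(b > 1, Project(a AS b, Relation(a))) becomes
    // Project(a AS b, Filter(a > 1, Relation(a)))
    val plan = Filter(GreaterThan(Attr("b"), Lit(1)),
      Project(Seq(Alias(Attr("a"), "b")), Relation(Seq("a"))))
    println(pushDown(plan))
  }
}
```

The point of the review comment is that this substitution is only safe if the operator the filter is pushed through still exposes every attribute the rewritten condition references, which is why `Expand` must have the `With`-generated attributes appended to its `projections` and `output`.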
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]