cloud-fan commented on code in PR #49202:
URL: https://github.com/apache/spark/pull/49202#discussion_r1905190707
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala:
##########
@@ -1978,6 +1977,55 @@ object PushPredicateThroughNonJoin extends
Rule[LogicalPlan] with PredicateHelpe
case _ => false
}
}
+
+ private def rewriteCondition(
+ cond: Expression,
+ aliasMap: AttributeMap[Alias]): Expression = {
+ replaceAlias(rewriteConditionByWith(cond, aliasMap), aliasMap)
+ }
+
+ /**
+ * Use [[With]] to rewrite a condition that contains attributes which are not cheap and are
+ * consumed multiple times. Each predicate generates at most one [[With]]. To facilitate
+ * subsequent merging of [[With]] expressions, use the same CommonExpressionDef ids for
+ * different [[With]] instances.
+ */
+ private def rewriteConditionByWith(
+ cond: Expression,
+ aliasMap: AttributeMap[Alias]): Expression = {
+ if (!SQLConf.get.getConf(SQLConf.ALWAYS_INLINE_COMMON_EXPR)) {
+ val replaceWithMap = cond.collect { case a: Attribute => a }
+ .groupBy(identity)
+ .transform((_, v) => v.size)
+ .filter { case (a, cnt) => aliasMap.contains(a) && cnt > 1 }
Review Comment:
I don't think we need to check the ref count anymore. Once we push a
predicate down through a Project/Aggregate, we already introduce repeated
execution (in both the Filter and the Project/Aggregate), so we should use
With to optimize it.
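For context, a minimal, self-contained sketch of the ref-count check under discussion (using hypothetical stand-in types, not Catalyst's real `Expression`/`Attribute` API): it counts how often each attribute occurs in a condition and keeps only those that are aliased and consumed more than once.

```scala
// Hypothetical simplified model of the PR's ref-count filter.
// Real Catalyst types (Expression, Attribute, AttributeMap) are replaced
// with minimal stand-ins for illustration only.
object RefCountSketch {
  sealed trait Expr
  case class Attr(name: String) extends Expr
  case class Lit(v: Int) extends Expr
  case class Gt(l: Expr, r: Expr) extends Expr
  case class And(l: Expr, r: Expr) extends Expr

  // Analogue of cond.collect { case a: Attribute => a }: gather every
  // attribute occurrence, with duplicates, by walking the tree.
  def collectAttrs(e: Expr): Seq[Attr] = e match {
    case a: Attr   => Seq(a)
    case Lit(_)    => Seq.empty
    case Gt(l, r)  => collectAttrs(l) ++ collectAttrs(r)
    case And(l, r) => collectAttrs(l) ++ collectAttrs(r)
  }

  // Mirrors the filter in the diff: keep attributes that are in the alias
  // map AND referenced more than once. The reviewer suggests the count
  // check (cnt > 1) is unnecessary once the predicate is pushed down.
  def multiUseAliased(cond: Expr, aliased: Set[String]): Map[Attr, Int] =
    collectAttrs(cond)
      .groupBy(identity)
      .map { case (a, occurrences) => a -> occurrences.size }
      .filter { case (a, cnt) => aliased.contains(a.name) && cnt > 1 }

  def main(args: Array[String]): Unit = {
    // x > 1 AND x > 5: "x" is aliased and referenced twice, so it qualifies.
    val cond = And(Gt(Attr("x"), Lit(1)), Gt(Attr("x"), Lit(5)))
    println(multiUseAliased(cond, Set("x")))
  }
}
```

Dropping the `cnt > 1` condition, as suggested, would make every aliased attribute in the pushed-down predicate a candidate for `With` rewriting, regardless of how many times it is referenced.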
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]