peter-toth commented on code in PR #40268:
URL: https://github.com/apache/spark/pull/40268#discussion_r1126350062


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala:
##########
@@ -198,16 +192,15 @@ object ConstantPropagation extends Rule[LogicalPlan] {
   private def safeToReplace(ar: AttributeReference, nullIsFalse: Boolean) =
     !ar.nullable || nullIsFalse
 
-  private def replaceConstants(condition: Expression, equalityPredicates: EqualityPredicates)
-    : Expression = {
-    val constantsMap = AttributeMap(equalityPredicates.map(_._1))
-    val predicates = equalityPredicates.map(_._2).toSet
-    def replaceConstants0(expression: Expression) = expression transform {
-      case a: AttributeReference => constantsMap.getOrElse(a, a)
-    }
+  private def replaceConstants(
+      condition: Expression,
+      equalityPredicates: EqualityPredicates): Expression = {
+    val predicates = equalityPredicates.values.map(_._2).toSet

Review Comment:
   The reason for changing `EqualityPredicates` to `mutable.Map` earlier was to avoid building a map here.
   
   The main point of that map is that we store only one `Literal` (and its original `BinaryComparison`) per attribute key. So if we have two or more conflicting `EqualTo` predicates, then in `replaceConstants()` we keep only one of them in its original form and rewrite the other conflicting ones. E.g. for `a = 1 AND a = 2` we store only `a -> (2, a = 2)` in the map and rewrite the expression to `2 = 1 AND a = 2`.
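
   For illustration only, here is a minimal self-contained Scala sketch of that behaviour. The `Attr`/`Lit`/`Eq`/`And` case classes and the `collect`/`replace` helpers are hypothetical stand-ins for Catalyst's `AttributeReference`, `Literal`, `EqualTo` and the condition tree, not the actual rule code:

```scala
import scala.collection.mutable

object ConflictingEqualToSketch {
  sealed trait Expr
  case class Attr(name: String) extends Expr
  case class Lit(value: Int) extends Expr
  case class Eq(left: Expr, right: Expr) extends Expr
  case class And(left: Expr, right: Expr) extends Expr

  // Collect at most one (literal, original equality) per attribute key;
  // a later conflicting equality simply overwrites the earlier entry.
  def collect(e: Expr, acc: mutable.Map[Expr, (Lit, Eq)]): Unit = e match {
    case And(l, r) => collect(l, acc); collect(r, acc)
    case equality @ Eq(a: Attr, lit: Lit) => acc(a) = (lit, equality)
    case _ =>
  }

  // Replace attributes with their literal everywhere except in the one
  // equality that was kept in the map, whose original form is preserved.
  def replace(e: Expr, acc: mutable.Map[Expr, (Lit, Eq)]): Expr = e match {
    case And(l, r) => And(replace(l, acc), replace(r, acc))
    case kept: Eq if acc.values.exists(_._2 eq kept) => kept
    case Eq(a: Attr, lit: Lit) => Eq(acc.get(a).map(_._1).getOrElse(a), lit)
    case other => other
  }

  def main(args: Array[String]): Unit = {
    val cond = And(Eq(Attr("a"), Lit(1)), Eq(Attr("a"), Lit(2)))  // a = 1 AND a = 2
    val acc = mutable.Map.empty[Expr, (Lit, Eq)]
    collect(cond, acc)
    println(acc)                 // only a -> (Lit(2), a = 2) survives
    println(replace(cond, acc))  // 2 = 1 AND a = 2
  }
}
```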



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala:
##########
@@ -112,16 +113,13 @@ object ConstantFolding extends Rule[LogicalPlan] {
 object ConstantPropagation extends Rule[LogicalPlan] {
   def apply(plan: LogicalPlan): LogicalPlan = plan.transformUpWithPruning(
     _.containsAllPatterns(LITERAL, FILTER), ruleId) {
-    case f: Filter =>
-      val (newCondition, _) = traverse(f.condition, replaceChildren = true, nullIsFalse = true)
-      if (newCondition.isDefined) {
-        f.copy(condition = newCondition.get)
-      } else {
-        f
-      }
+    case f: Filter => f.mapExpressions(traverse(_, replaceChildren = true, nullIsFalse = true)._1)
   }
 
-  type EqualityPredicates = Seq[((AttributeReference, Literal), BinaryComparison)]
+  // The keys are always canonicalized `AttributeReference`s, but it is easier to use `Expression`
+  // type keys here instead of casting `AttributeReference.canonicalized` to `AttributeReference` at
+  // the calling sites.
+  type EqualityPredicates = mutable.Map[Expression, (Literal, BinaryComparison)]

Review Comment:
   Actually, I deliberately used a mutable map here so that merging the two predicate maps later (`equalityPredicates ++= equalityPredicatesRight`) can be done in place.
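
   A tiny sketch of that point, using plain `String`/`Int` placeholders instead of the Catalyst `Expression`/`Literal`/`BinaryComparison` types: `++=` adds the right-hand entries into the existing map in place, so merging the two sides of an `And` does not have to build a new map.

```scala
import scala.collection.mutable

object MutableMergeSketch {
  def main(args: Array[String]): Unit = {
    // Placeholders for the left/right EqualityPredicates accumulated while traversing an And.
    val equalityPredicates      = mutable.Map("a" -> 1)
    val equalityPredicatesRight = mutable.Map("b" -> 2)

    // In-place merge: unlike `left ++ right` on immutable maps, no new map is allocated.
    equalityPredicates ++= equalityPredicatesRight

    println(equalityPredicates)  // HashMap(a -> 1, b -> 2) (entry order may vary)
  }
}
```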



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

