Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17993#discussion_r118846885

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala ---
@@ -54,6 +54,62 @@ object ConstantFolding extends Rule[LogicalPlan] {
     }
   }

+/**
+ * Substitutes [[Attribute Attributes]] which can be statically evaluated with their corresponding
+ * value in conjunctive [[Expression Expressions]], e.g.
+ * {{{
+ *   SELECT * FROM table WHERE i = 5 AND j = i + 3
+ *   ==>  SELECT * FROM table WHERE i = 5 AND j = 8
+ * }}}
+ *
+ * Approach used:
+ * - Start from the AND operator as the root.
+ * - Get all the child conjunctive predicates which are EqualTo / EqualNullSafe such that they
+ *   don't have a `NOT` or `OR` operator in them.
+ * - Populate a mapping of attribute => constant value by looking at all the equality predicates.
+ * - Using this mapping, replace occurrences of the attributes with the corresponding constant
+ *   values in the AND node.
+ */
+object ConstantPropagation extends Rule[LogicalPlan] with PredicateHelper {
+  def containsNonConjunctionPredicates(expression: Expression): Boolean = expression.find {
+    case _: Not | _: Or => true
+    case _ => false
+  }.isDefined
+
+  def apply(plan: LogicalPlan): LogicalPlan = plan transform {
+    case f: Filter => f transformExpressionsUp {
--- End diff --

Maybe I am being myopic here, but the result should be the same, right? The only way this regresses is when we plan a `CartesianProduct` instead of a `BroadcastNestedLoopJoin`... I am fine with not optimizing this for now, but it would be nice if these constraints were at least generated here.
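For readers following along, the two-step idea described in the scaladoc above (collect attribute-to-literal bindings from equality conjuncts, then substitute them into the other conjuncts) can be sketched outside of Catalyst. The following is a minimal Python mock-up, not Spark code: the `Attr`, `Lit`, `Eq`, and `Add` classes and the `propagate` function are hypothetical stand-ins for Catalyst's expression trees, and a subsequent `ConstantFolding` pass is what would reduce `5 + 3` to `8`.

```python
# Hypothetical sketch of the ConstantPropagation idea from the diff above.
# These tiny expression classes are illustrative, NOT Catalyst's real API.
from dataclasses import dataclass


@dataclass(frozen=True)
class Attr:
    name: str


@dataclass(frozen=True)
class Lit:
    value: int


@dataclass(frozen=True)
class Eq:
    left: object
    right: object


@dataclass(frozen=True)
class Add:
    left: object
    right: object


def propagate(conjuncts):
    """Within one conjunction, bind attributes equated to literals and
    substitute those bindings into the remaining predicates."""
    # Step 1: attribute -> literal mapping from the equality conjuncts.
    mapping = {p.left: p.right for p in conjuncts
               if isinstance(p, Eq)
               and isinstance(p.left, Attr)
               and isinstance(p.right, Lit)}

    # Step 2: replace mapped attributes inside the other predicates.
    def subst(e):
        if isinstance(e, Attr):
            return mapping.get(e, e)
        if isinstance(e, Eq):
            return Eq(e.left, subst(e.right))
        if isinstance(e, Add):
            return Add(subst(e.left), subst(e.right))
        return e

    # Keep the defining equalities intact; rewrite everything else.
    return [p if isinstance(p, Eq) and p.left in mapping else subst(p)
            for p in conjuncts]


# WHERE i = 5 AND j = i + 3  becomes  i = 5 AND j = 5 + 3;
# constant folding would then finish the job, yielding j = 8.
conjuncts = [Eq(Attr("i"), Lit(5)),
             Eq(Attr("j"), Add(Attr("i"), Lit(3)))]
print(propagate(conjuncts))
```

In Catalyst the substitution happens bottom-up via `transformExpressionsUp`, and predicates under `NOT` or `OR` are excluded (the `containsNonConjunctionPredicates` guard), since a binding established in one branch of a disjunction is not valid in the other.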