HyukjinKwon commented on a change in pull request #21848:
URL: https://github.com/apache/spark/pull/21848#discussion_r549215158
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala
##########
@@ -390,6 +390,8 @@ object SimplifyConditionals extends Rule[LogicalPlan] with PredicateHelper {
case If(TrueLiteral, trueValue, _) => trueValue
case If(FalseLiteral, _, falseValue) => falseValue
case If(Literal(null, _), _, falseValue) => falseValue
+ case If(cond, trueValue, falseValue)
+   if cond.deterministic && trueValue.semanticEquals(falseValue) => trueValue
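
For readers of this thread, a minimal sketch (not part of the PR) of what the added case does, written against the Catalyst test DSL. The object name, the `a` column, and the `out` alias are made up for illustration:

```scala
import org.apache.spark.sql.catalyst.dsl.expressions._
import org.apache.spark.sql.catalyst.dsl.plans._
import org.apache.spark.sql.catalyst.expressions.{GreaterThan, If, Literal}
import org.apache.spark.sql.catalyst.optimizer.SimplifyConditionals
import org.apache.spark.sql.catalyst.plans.logical.LocalRelation

// Illustrative sketch only: shows the If(cond, v, v) => v rewrite in isolation.
object SimplifyConditionalsSketch extends App {
  val relation = LocalRelation('a.int)
  val cond = GreaterThan('a, Literal(0))

  // Both branches are semantically equal and cond is deterministic,
  // so the new case collapses the whole If to Literal("x").
  val plan = relation.select(If(cond, Literal("x"), Literal("x")).as("out")).analyze

  // The optimized plan no longer contains cond at all.
  println(SimplifyConditionals(plan))
}
```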
Review comment:
Looks like we'll still have this problem because the rewrite skips the evaluation of `cond`.
Recently, SPARK-33544 introduced another approach for this, which I think supersedes SPARK-24913. I think we can switch this to the SPARK-33544 approach. @dbtsai, can we follow this up using `NoThrow`?
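
To illustrate the concern (this sketch is not from the original comment), here is what the behavior change can look like end to end. It assumes a local SparkSession with ANSI mode enabled; the query and object name are made up:

```scala
import org.apache.spark.sql.SparkSession

object SkippedConditionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .config("spark.sql.ansi.enabled", "true")
      .getOrCreate()

    // With ANSI mode on, 1 / c fails at runtime for c = 0. If the optimizer
    // rewrites IF(cond, v, v) to v without evaluating cond, the division is
    // never executed and the error silently disappears. That is why the
    // suggestion above is to restrict the rewrite to conditions that cannot throw.
    try {
      spark.sql(
        "SELECT IF(1 / c > 0, 'same', 'same') AS value FROM VALUES (0) AS t(c)").show()
    } catch {
      case e: Exception => println(s"Query failed: ${e.getMessage}")
    }

    spark.stop()
  }
}
```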
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]