Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21852#discussion_r206500527
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/expressions.scala
---
@@ -416,6 +416,21 @@ object SimplifyConditionals extends Rule[LogicalPlan] with PredicateHelper {
// these branches can be pruned away
val (h, t) = branches.span(_._1 != TrueLiteral)
CaseWhen( h :+ t.head, None)
+
+ case e @ CaseWhen(branches, Some(elseValue)) if branches
+ .forall(_._2.semanticEquals(elseValue)) =>
+      // For non-deterministic conditions with side effect, we can not remove it, or change
+      // the ordering. As a result, we try to remove the deterministic conditions from the tail.
--- End diff ---
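For context, the existing rule quoted above prunes branches that follow the first always-true condition, since they can never be reached. A minimal sketch of that `span`-based pruning on a toy branch model (the `Cond`/`AlwaysTrue`/`Pred` names are illustrative stand-ins, not Spark's Catalyst classes):

```scala
// Toy stand-ins for CaseWhen conditions; in Catalyst the always-true
// condition is the TrueLiteral expression.
sealed trait Cond
case object AlwaysTrue extends Cond
final case class Pred(name: String) extends Cond

object SpanPruneDemo {
  // Keep branches up to and including the first always-true condition;
  // everything after it is unreachable and can be dropped.
  def prune(branches: Seq[Cond]): Seq[Cond] = {
    val (h, t) = branches.span(_ != AlwaysTrue)
    if (t.isEmpty) branches else h :+ t.head
  }
}
```

Note the quoted rule writes `CaseWhen(h :+ t.head, None)` directly, relying on a guard elsewhere to ensure the tail is non-empty; the sketch makes that check explicit.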
I think it's more readable to write Java-style code here:
```scala
var hitNonDeterministic = false
var i = branches.length
while (i > 0 && !hitNonDeterministic) {
  hitNonDeterministic = !branches(i - 1)._1.deterministic
  // only drop the branch if its condition is deterministic,
  // so branches.take(i) still includes the non-deterministic one
  if (!hitNonDeterministic) {
    i -= 1
  }
}
if (i == 0) {
  elseValue
} else {
  e.copy(branches = branches.take(i))
}
```
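The suggested loop scans the branches from the tail, dropping deterministic conditions (whose values all equal the else value) until it hits a non-deterministic one, which must be kept because evaluating it may have side effects. A self-contained sketch of that logic on a toy branch model (the `Branch` case class and `pruneTail` name are hypothetical, not Spark's API):

```scala
// Toy stand-in for a CaseWhen branch: in Catalyst this would be a
// (condition, value) pair where the condition is an Expression.
final case class Branch(name: String, deterministic: Boolean)

object TailPruneDemo {
  // Drop deterministic branches from the tail; stop at (and keep) the
  // last non-deterministic one. An empty result means the whole
  // CaseWhen can be folded to its else value.
  def pruneTail(branches: Seq[Branch]): Seq[Branch] = {
    var hitNonDeterministic = false
    var i = branches.length
    while (i > 0 && !hitNonDeterministic) {
      hitNonDeterministic = !branches(i - 1).deterministic
      if (!hitNonDeterministic) i -= 1
    }
    branches.take(i)
  }
}
```

The imperative while-loop is used here, as in the suggestion, because the termination condition (first non-deterministic branch from the tail) is awkward to express with a single collection combinator.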
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]