Zawa-ll commented on code in PR #47246:
URL: https://github.com/apache/spark/pull/47246#discussion_r1669382233
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/predicates.scala:
##########
@@ -998,10 +998,27 @@ abstract class BinaryComparison extends BinaryOperator
with Predicate {
}
}
- protected lazy val ordering: Ordering[Any] =
TypeUtils.getInterpretedOrdering(left.dataType)
+ protected lazy val ordering: Ordering[Any] = new Ordering[Any] {
Review Comment:
Thank you for your suggestion!
I think your solution is more robust, since it uses compareTo for string
comparisons and falls back to TypeUtils.getInterpretedOrdering for other types.
The logic I was about to commit is below, but I wasn't sure which approach
would be better.
@Masykus Can you please take a look? Thank you!
```suggestion
    override def compare(x: Any, y: Any): Int = {
      (x, y) match {
        case (xs: String, yi: Int) => xs.compare(yi.toString)
        case (xi: Int, ys: String) => xi.toString.compare(ys)
        // Fall back to the interpreted ordering for all other type pairs;
        // super.compare is abstract on Ordering[Any] and would not compile.
        case _ => TypeUtils.getInterpretedOrdering(left.dataType).compare(x, y)
      }
    }
```
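For context, here is a minimal self-contained sketch (not Spark's actual classes; `MixedOrderingSketch` and its `fallback` ordering are hypothetical stand-ins for `BinaryComparison` and `TypeUtils.getInterpretedOrdering`) showing the behavior the suggestion above aims for: String/Int pairs are compared lexicographically as strings, and everything else defers to a base ordering.

```scala
// Hypothetical sketch of the mixed String/Int ordering under discussion.
object MixedOrderingSketch {
  // Stand-in for TypeUtils.getInterpretedOrdering(left.dataType):
  // handles same-typed pairs only.
  private val fallback: Ordering[Any] = new Ordering[Any] {
    override def compare(x: Any, y: Any): Int = (x, y) match {
      case (a: Int, b: Int)       => Integer.compare(a, b)
      case (a: String, b: String) => a.compareTo(b)
      case _ =>
        throw new IllegalArgumentException(s"unsupported pair: ($x, $y)")
    }
  }

  val ordering: Ordering[Any] = new Ordering[Any] {
    override def compare(x: Any, y: Any): Int = (x, y) match {
      // Mixed String/Int pairs are compared as strings.
      case (xs: String, yi: Int) => xs.compare(yi.toString)
      case (xi: Int, ys: String) => xi.toString.compare(ys)
      // Everything else uses the base ordering.
      case _ => fallback.compare(x, y)
    }
  }
}
```

Note that comparing as strings gives lexicographic, not numeric, order: `"2"` sorts after `10` because `'2' > '1'`, which may or may not be the semantics the PR wants.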
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]