maropu commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594790320
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
if (res > Byte.MaxValue || res < Byte.MinValue) {
- throw new ArithmeticException(s"$x $op $y caused overflow.")
Review comment:
> For Int/Long, the message is "int/long overflow" since Spark is calling the "*Exact" (e.g. addExact, negateExact) methods from java.lang.Math.

The unification itself looks nice, and I think we should add a try-catch in the int/long cases first.
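As a rough illustration of the try-catch idea (this is not Spark's actual code; the method name `addExactWithMessage` and the class are hypothetical), the Int/Long paths could wrap `java.lang.Math`'s `*Exact` calls and rethrow with the same operand-bearing message that the Byte/Short `checkOverflow` produces:

```java
// Hedged sketch: unify the overflow message for the Int case by catching
// the ArithmeticException that Math.addExact throws ("integer overflow")
// and rethrowing with the "$x $op $y caused overflow." style message
// used by ByteExactNumeric.checkOverflow in the diff above.
public class OverflowDemo {
    static int addExactWithMessage(int x, int y) {
        try {
            return Math.addExact(x, y);
        } catch (ArithmeticException e) {
            // Math.addExact's own message is "integer overflow";
            // replace it with the unified, operand-bearing message.
            throw new ArithmeticException(x + " + " + y + " caused overflow.");
        }
    }

    public static void main(String[] args) {
        // Non-overflowing addition passes through unchanged.
        System.out.println(addExactWithMessage(1, 2));
        // Overflow now reports the operands, like the Byte/Short cases.
        try {
            addExactWithMessage(Integer.MAX_VALUE, 1);
        } catch (ArithmeticException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The same pattern would apply to `subtractExact`, `multiplyExact`, and `negateExact`, and to the Long variants.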
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.