gengliangwang commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594520355



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       > It's a bit risky to implement the "exact" methods ourselves, as JDK 
may update them in future versions. I'd rather add a try-catch to change the 
error message.
   
   So which one do you prefer? 
   
   1. Change the error message of byte/short overflow to simply "tinyint/smallint overflow" (I checked PostgreSQL and it also simply shows error messages like "ERROR: integer out of range").
   2. Add a try/catch in int/long arithmetic operations and throw a new exception with details.
   3. Keep the current situation and don't do the unification.
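   For option 2, a minimal sketch of what the try/catch approach could look like: delegate the overflow check to the JDK's `Math.addExact` and only rewrite the error message. The helper name `addExactWithMessage` is illustrative, not from the PR.
   
   ```java
   public class OverflowDemo {
       // Hypothetical helper: reuse the JDK's exact arithmetic for overflow
       // detection, but rethrow with a more descriptive message.
       static int addExactWithMessage(int x, int y) {
           try {
               return Math.addExact(x, y);
           } catch (ArithmeticException e) {
               throw new ArithmeticException(x + " + " + y + " caused overflow.");
           }
       }
   
       public static void main(String[] args) {
           System.out.println(addExactWithMessage(1, 2)); // prints 3
           try {
               addExactWithMessage(Integer.MAX_VALUE, 1);
           } catch (ArithmeticException e) {
               // prints "2147483647 + 1 caused overflow."
               System.out.println(e.getMessage());
           }
       }
   }
   ```
   
   This keeps the overflow semantics owned by the JDK (so future JDK updates are picked up automatically) at the cost of an extra exception construction on the overflow path.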
   
   cc @MaxGekk @cloud-fan @maropu 
   
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


