gengliangwang commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594497283



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       Yes, I thought about this.
   Alternatively, we can align the error message of int/long with the error message of byte/short, which is more user-friendly. Then we basically have to re-implement the "exact" methods in Spark, since the JDK's `Math.*Exact` methods throw a plain "integer overflow" / "long overflow" without reporting the operands (rough sketch below).
   cc @cloud-fan 
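
   For context, a minimal sketch of what that re-implementation could look like, mirroring the byte/short `checkOverflow` pattern above for Int. The object name `IntExactNumeric` and the `plus`/`times` signatures here are hypothetical, for illustration only; this is not the actual patch:

object IntExactNumeric {
  // Widen to Long, then check whether the exact result still fits in an Int,
  // producing the same operand-carrying message as the byte/short case above.
  private def checkOverflow(res: Long, x: Int, y: Int, op: String): Unit = {
    if (res > Int.MaxValue || res < Int.MinValue) {
      throw new ArithmeticException(s"$x $op $y caused overflow.")
    }
  }

  def plus(x: Int, y: Int): Int = {
    val tmp = x.toLong + y.toLong   // exact result in the wider Long type
    checkOverflow(tmp, x, y, "+")
    tmp.toInt
  }

  def times(x: Int, y: Int): Int = {
    val tmp = x.toLong * y.toLong   // Long cannot overflow for Int operands
    checkOverflow(tmp, x, y, "*")
    tmp.toInt
  }
}

   For Long there is no wider primitive to widen into, so the check would instead follow the sign-based logic of `Math.addExact` (overflow occurred iff both operands share a sign and the result's sign differs).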



