mgaido91 commented on a change in pull request #25144: [SPARK-28369][SQL] Honor spark.sql.decimalOperations.nullOnOverflow in ScalaUDF result
URL: https://github.com/apache/spark/pull/25144#discussion_r303346436
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/Decimal.scala
 ##########
 @@ -414,20 +414,12 @@ final class Decimal extends Ordered[Decimal] with Serializable {
 
   def floor: Decimal = if (scale == 0) this else {
     val newPrecision = DecimalType.bounded(precision - scale + 1, 0).precision
-    val res = toPrecision(newPrecision, 0, ROUND_FLOOR)
-    if (res == null) {
 -      throw new AnalysisException(s"Overflow when setting precision to $newPrecision")
-    }
-    res
+    toPrecision(newPrecision, 0, ROUND_FLOOR, nullOnOverflow = false)
   }
 
   def ceil: Decimal = if (scale == 0) this else {
     val newPrecision = DecimalType.bounded(precision - scale + 1, 0).precision
-    val res = toPrecision(newPrecision, 0, ROUND_CEILING)
-    if (res == null) {
 -      throw new AnalysisException(s"Overflow when setting precision to $newPrecision")
-    }
-    res
+    toPrecision(newPrecision, 0, ROUND_CEILING, nullOnOverflow = false)
 
 Review comment:
  Well, I don't think that is really an issue here. I see no way `ceil` and `floor` can produce an overflow: they reduce the required precision rather than increase it. So this case cannot really happen, and it is fine to just throw an exception.

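To make the reasoning concrete, here is a small standalone sketch using plain `java.math.BigDecimal` (not Spark's `Decimal` class, so the helper name is illustrative). It shows why rounding to scale 0 with floor or ceiling always fits in the bounded precision `precision - scale + 1` that `Decimal.floor`/`Decimal.ceil` allocate: dropping the fractional part can add at most one integer digit (e.g. 99.9 rounds up to 100).

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

object FloorCeilPrecision {
  // Hypothetical helper: precision of a value after flooring to scale 0.
  def flooredPrecision(d: JBigDecimal): Int =
    d.setScale(0, RoundingMode.FLOOR).precision

  // Hypothetical helper: precision of a value after ceiling to scale 0.
  def ceiledPrecision(d: JBigDecimal): Int =
    d.setScale(0, RoundingMode.CEILING).precision

  def main(args: Array[String]): Unit = {
    val d = new JBigDecimal("99.9")       // precision 3, scale 1
    val bound = d.precision - d.scale + 1 // 3, mirroring DecimalType.bounded in Decimal.floor

    // floor(99.9) = 99: precision 2, within the bound
    println(flooredPrecision(d) <= bound) // true

    // ceil(99.9) = 100: precision 3, still within the bound
    println(ceiledPrecision(d) <= bound)  // true
  }
}
```

Since the bound can never be exceeded, `toPrecision(..., nullOnOverflow = false)` in these two methods throws only in a branch that is unreachable in practice.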
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
