cloud-fan commented on a change in pull request #20350: [SPARK-23179][SQL] Support option to throw exception if overflow occurs
URL: https://github.com/apache/spark/pull/20350#discussion_r296441515
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
 ##########
 @@ -1441,6 +1441,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val DECIMAL_OPERATIONS_NULL_ON_OVERFLOW =
+    buildConf("spark.sql.decimalOperations.nullOnOverflow")
 
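The quoted hunk cuts off after the `buildConf` call. As a readability aid, here is a sketch of how the full definition plausibly continues, mirroring the `.booleanConf`/`.createWithDefault` builder pattern of the neighboring entry in the same hunk; the doc string is an assumption, not the actual source:

```scala
// Hypothetical completion of the truncated hunk above. Only the
// builder pattern is taken from the diff; the doc text is an assumption.
val DECIMAL_OPERATIONS_NULL_ON_OVERFLOW =
  buildConf("spark.sql.decimalOperations.nullOnOverflow")
    .doc("When true, decimal arithmetic returns null on overflow " +
      "instead of throwing an exception.")
    .booleanConf
    .createWithDefault(true)
```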
 Review comment:
   A DB does not have to follow the SQL standard completely in every corner. The current behavior in Spark is by design, and I don't think it's nonsense.
   
   I do agree that it's a valid requirement that some users want overflow to fail, but that behavior should be guarded by a config.
   
   My question is whether we need one config for overflow in general, or two configs: one for decimal and one for non-decimal operations.
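   To make the question concrete, here is a hypothetical sketch of the two alternatives in the SQLConf builder style used above; every key, doc string, and default below is an illustrative assumption, not a proposal from this thread:

```scala
// Alternative 1: a single config governing overflow behavior for all
// arithmetic. The key and doc string are hypothetical.
val FAIL_ON_ARITHMETIC_OVERFLOW =
  buildConf("spark.sql.arithmeticOperations.failOnOverflow")
    .doc("When true, any arithmetic overflow throws an exception; " +
      "when false, Spark keeps its current behavior.")
    .booleanConf
    .createWithDefault(false)

// Alternative 2: two configs, splitting decimal from non-decimal
// (integral) arithmetic. Keys and doc strings are hypothetical.
val DECIMAL_FAIL_ON_OVERFLOW =
  buildConf("spark.sql.decimalOperations.failOnOverflow")
    .doc("When true, decimal overflow throws instead of returning null.")
    .booleanConf
    .createWithDefault(false)

val INTEGRAL_FAIL_ON_OVERFLOW =
  buildConf("spark.sql.integralOperations.failOnOverflow")
    .doc("When true, integral overflow throws instead of wrapping around.")
    .booleanConf
    .createWithDefault(false)
```

   Under alternative 2, users could opt into strict decimal semantics without changing the wrap-around behavior of integral types, which is the trade-off the question above is about.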

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
