mgaido91 commented on a change in pull request #20350: [SPARK-23179][SQL]
Support option to throw exception if overflow occurs during Decimal arithmetic
URL: https://github.com/apache/spark/pull/20350#discussion_r297526766
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1441,6 +1441,16 @@ object SQLConf {
.booleanConf
.createWithDefault(true)
+ val DECIMAL_OPERATIONS_NULL_ON_OVERFLOW =
+ buildConf("spark.sql.decimalOperations.nullOnOverflow")
Review comment:
Thanks for your comments @JoshRosen.
Yes, this deals with the overflow case. Underflow (i.e. precision loss) is
handled differently and its behavior depends on another config (see
SPARK-22036): we either avoid precision loss, which may eventually cause
overflow (the old behavior), or we truncate the result (as defined by the SQL
standard, closely following the SQL Server behavior from which our decimal
operations implementation was derived). So this flag relates only to the
overflow case.
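To make the distinction concrete, here is a minimal sketch of the two cases,
assuming the config name introduced in this PR and the SPARK-22036 config
name; the exact defaults and the error raised on overflow are assumptions,
not the final behavior of the merged change:
```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// Overflow: the result cannot be represented in the result type at all.
spark.conf.set("spark.sql.decimalOperations.nullOnOverflow", "true")
spark.sql(
  "SELECT CAST(99999999999999999999999999999999999999 AS DECIMAL(38,0)) * 10"
).show()  // with nullOnOverflow=true, the overflowing product comes back as NULL

spark.conf.set("spark.sql.decimalOperations.nullOnOverflow", "false")
// the same query would instead fail at runtime (e.g. with an ArithmeticException)

// Precision loss: the result fits, but not with the full requested scale.
// This is governed by the separate SPARK-22036 flag, not by nullOnOverflow:
spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "true")
spark.sql("SELECT CAST(1 AS DECIMAL(38,18)) / 3").show()  // scale truncated to fit
```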