Github user bersprockets commented on a diff in the pull request:
https://github.com/apache/spark/pull/20350#discussion_r163269198
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1074,6 +1074,16 @@ object SQLConf {
     .booleanConf
     .createWithDefault(true)

+  val DECIMAL_OPERATIONS_NULL_ON_OVERFLOW =
+    buildConf("spark.sql.decimalOperations.nullOnOverflow")
+      .internal()
+      .doc("When true (default), if an overflow on a decimal occurs, then NULL is returned. " +
+        "Spark's older versions and Hive behave in this way. If turned to false, SQL ANSI 2011 " +
+        "specification, will be followed instead: an arithmetic exception is thrown. This is " +
+        "what most of the SQL databases do.")
--- End diff ---
Tiny nit: the doc string reads

    "If turned to false, SQL ANSI 2011 specification, will be followed instead"

This should be

    "If turned to false, SQL ANSI 2011 specification will be followed instead"

(i.e., drop the comma after "specification").