GitHub user mgaido91 opened a pull request: https://github.com/apache/spark/pull/20023
[SPARK-22036][SQL] Decimal multiplication with high precision/scale often returns NULL

## What changes were proposed in this pull request?

When there is an arithmetic operation between Decimals and the result is a number which is not exactly representable with the result's precision and scale, Spark returns `NULL`. This was done to reflect Hive's behavior, but it goes against SQL ANSI 2011, which states that "If the result cannot be represented exactly in the result type, then whether it is rounded or truncated is implementation-defined". Moreover, Hive has since changed its behavior to respect the standard, thanks to HIVE-15331. Therefore, this PR proposes to:

 - update the rules that determine the result precision and scale to the new ones introduced by Hive in HIVE-15331, which reflect SQLServer's behavior;
 - round the result of an operation when it is not exactly representable with the result's precision and scale, instead of returning `NULL`.

## How was this patch tested?

Modified and added UTs. Comparisons with the results of Hive and SQLServer.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/mgaido91/spark SPARK-22036

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20023.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #20023

----
commit 3037d4aa6afc4d7630d86d29b8dd7d7d724cc990
Author: Marco Gaido <marcogaid...@gmail.com>
Date:   2017-12-17T21:45:06Z

    [SPARK-22036][SQL] Decimal multiplication with high precision/scale often returns NULL

----

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
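The precision/scale adjustment the PR refers to can be sketched as follows. This is an illustrative Python model of the publicly documented SQLServer rule that HIVE-15331 adopted (cap precision at 38, sacrificing scale before integral digits while keeping at least 6 fractional digits); the function name and constants are assumptions for the sketch, not Spark's actual Scala implementation:

```python
# Sketch of the precision/scale adjustment rule from HIVE-15331 /
# SQLServer, as referenced by the PR. Illustrative model only.

MAX_PRECISION = 38          # maximum precision a DecimalType can hold
MINIMUM_ADJUSTED_SCALE = 6  # minimum number of fractional digits kept

def adjust_precision_scale(precision, scale):
    """Cap (precision, scale) at MAX_PRECISION, dropping fractional
    digits before integral ones, but keeping at least min(scale, 6)."""
    if precision <= MAX_PRECISION:
        return precision, scale
    # Digits to the left of the decimal point must be preserved.
    int_digits = precision - scale
    min_scale = min(scale, MINIMUM_ADJUSTED_SCALE)
    adjusted_scale = max(MAX_PRECISION - int_digits, min_scale)
    return MAX_PRECISION, adjusted_scale

# Multiplication yields precision = p1 + p2 + 1, scale = s1 + s2.
# decimal(38,10) * decimal(9,0): naive (48,10) is adjusted to (38,6),
# so the result is rounded to 6 fractional digits instead of being NULL.
print(adjust_precision_scale(38 + 9 + 1, 10 + 0))  # -> (38, 6)
```

With the old rules a product whose required precision exceeded 38 produced `NULL`; under this adjustment the type is clamped and the value is rounded to the adjusted scale instead.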