skambha commented on a change in pull request #27629: [SPARK-28067][SQL] Fix incorrect results during aggregate sum for decimal overflow by throwing exception
URL: https://github.com/apache/spark/pull/27629#discussion_r380968792
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala
##########
@@ -169,18 +169,14 @@ class DataFrameSuite extends QueryTest
DecimalData(BigDecimal("1"* 20 + ".123"), BigDecimal("1"* 20 + ".123"))
::
DecimalData(BigDecimal("9"* 20 + ".123"), BigDecimal("9"* 20 +
".123")) :: Nil).toDF()
- Seq(true, false).foreach { ansiEnabled =>
- withSQLConf((SQLConf.ANSI_ENABLED.key, ansiEnabled.toString)) {
+ Seq("true", "false").foreach { codegenEnabled =>
+ withSQLConf((SQLConf.WHOLESTAGE_CODEGEN_ENABLED.key, codegenEnabled)) {
val structDf = largeDecimals.select("a").agg(sum("a"))
- if (!ansiEnabled) {
Review comment:
SPARK-28224 only partially addressed decimal overflow for sum: it handles the case of two values. In the test case that was added as part of SPARK-28224, if you add another row to the dataset, you get incorrect results instead of returning null on overflow.
In this PR we address decimal overflow in aggregate sum by throwing an exception, hence this test has been modified.
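To illustrate the scenario, here is a rough standalone sketch (not the actual DataFrameSuite test; the local SparkSession setup, column name `a`, and the printed messages are my own assumptions). Three Decimal(38, 18) values that each use all 20 integral digits cannot be summed within the result type, and after this PR that overflow is expected to surface as an exception rather than a wrong value:

```scala
// Standalone sketch, not the DataFrameSuite test: demonstrates the overflow this PR targets.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

object DecimalSumOverflowSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("decimal-sum-overflow")
      .getOrCreate()
    import spark.implicits._

    // Scala BigDecimal columns are stored as Decimal(38, 18); a value with 20 integral
    // digits fills the type exactly, so the sum of three of them cannot fit.
    val largeDecimals = Seq(
      BigDecimal("9" * 20 + ".123"),
      BigDecimal("9" * 20 + ".123"),
      BigDecimal("9" * 20 + ".123")).toDF("a")

    // Exercise both code paths, mirroring the WHOLESTAGE_CODEGEN_ENABLED loop in the diff above.
    Seq("true", "false").foreach { codegenEnabled =>
      spark.conf.set("spark.sql.codegen.wholeStage", codegenEnabled)
      try {
        val result = largeDecimals.agg(sum("a")).collect()
        // Before this PR the sum could come back silently wrong (or null) on overflow.
        println(s"codegen=$codegenEnabled: got ${result.head}")
      } catch {
        // With this PR the overflow is expected to surface as an (Arithmetic)Exception.
        case e: Exception => println(s"codegen=$codegenEnabled: overflow raised $e")
      }
    }
    spark.stop()
  }
}
```

The point of the change is that a hard failure is preferable to a silently wrong aggregate, which is why the test no longer branches on ANSI_ENABLED and instead only varies whole-stage codegen.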