maropu commented on pull request #29404:
URL: https://github.com/apache/spark/pull/29404#issuecomment-674540569


   It seems this commit caused a genuine test failure in `DataFrameSuite`:
   ```
   [info] - SPARK-28224: Aggregate sum big decimal overflow *** FAILED *** (384 milliseconds)
   [info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 77.0 failed 1 times, most recent failure: Lost task 0.0 in stage 77.0 (TID 197, 192.168.11.10, executor driver): java.lang.ArithmeticException: Decimal(expanded,111111111111111111110.246000000000000000,39,18) cannot be represented as Decimal(38, 18).
   [info]       at org.apache.spark.sql.types.Decimal.toPrecision(Decimal.scala:369)
   [info]       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.agg_doAggregate_sum_0$(Unknown Source)
   [info]       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.agg_doConsume_0$(Unknown Source)
   [info]       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.agg_doAggregateWithoutKey_0$(Unknown Source)
   [info]       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(Unknown Source)
   [info]       at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
   ```
   
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-branch-3.0-test-maven-hadoop-2.7-hive-2.3/533/
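   For reference, the overflow itself is easy to see with plain `java.math.BigDecimal`: the failed value has 21 integer digits plus 18 fraction digits, i.e. precision 39, which is one more than DECIMAL(38, 18) can hold. The `fits` helper below is a hypothetical sketch of that check, not Spark's actual `Decimal.toPrecision` implementation:

   ```java
   import java.math.BigDecimal;
   import java.math.RoundingMode;

   public class DecimalOverflowSketch {
       // Hypothetical helper: a value fits DECIMAL(precision, scale) only if,
       // after rounding to the target scale, its total digit count (the
       // precision of the unscaled value) does not exceed `precision`.
       static boolean fits(BigDecimal v, int precision, int scale) {
           BigDecimal rounded = v.setScale(scale, RoundingMode.HALF_UP);
           return rounded.precision() <= precision;
       }

       public static void main(String[] args) {
           // The sum from the log: 21 integer digits + 18 fraction digits = 39,
           // which cannot be represented as Decimal(38, 18).
           BigDecimal sum = new BigDecimal("111111111111111111110.246000000000000000");
           System.out.println(fits(sum, 38, 18)); // prints false
       }
   }
   ```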
   
   I've checked that the test failed w/ this commit and passed w/o it in my local env.
   @gengliangwang Could you check the failure?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
