Tymofii created SPARK-40351:
-------------------------------

             Summary: Spark Sum increases the precision of DecimalType 
arguments by 10
                 Key: SPARK-40351
                 URL: https://issues.apache.org/jira/browse/SPARK-40351
             Project: Spark
          Issue Type: Question
          Components: Optimizer
    Affects Versions: 3.2.0
            Reporter: Tymofii


Currently, Spark automatically increases the precision of a Decimal field by 10 (a hard-coded value) after a SUM aggregate operation - 
[https://github.com/apache/spark/blob/branch-3.2/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala#L1877]
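
For illustration, a minimal sketch of the behavior (assuming Spark 3.2 and a local session; the column and app names are made up):

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum
import org.apache.spark.sql.types.DecimalType

val spark = SparkSession.builder().master("local[*]").appName("decimal-sum-demo").getOrCreate()
import spark.implicits._

// A column declared as DECIMAL(10, 2).
val df = Seq(BigDecimal("1.00"), BigDecimal("2.50"))
  .toDF("amount")
  .select($"amount".cast(DecimalType(10, 2)).as("amount"))

// SUM widens the result precision by the hard-coded 10:
// DECIMAL(10, 2) becomes DECIMAL(20, 2) in the output schema.
df.agg(sum($"amount")).printSchema()
// root
//  |-- sum(amount): decimal(20,2) (nullable = true)
{code}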

There are a couple of questions:
 # Why was 10 chosen as the default? (A sketch of where this constant appears is below.)
 # Does it make sense to allow the user to override this value via configuration?
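
For context, the widening appears to come from Sum's result-type computation. A paraphrased sketch (not a verbatim copy; the actual code lives in org.apache.spark.sql.catalyst.expressions.aggregate.Sum on branch-3.2, and details may differ):

{code:scala}
// Paraphrased: for decimal inputs, the result precision is the input
// precision plus a hard-coded 10, capped at the maximum decimal
// precision (38) by DecimalType.bounded.
lazy val resultType = child.dataType match {
  case DecimalType.Fixed(precision, scale) =>
    DecimalType.bounded(precision + 10, scale)  // the "+ 10" in question
  case _: IntegralType => LongType
  case _ => child.dataType
}
{code}

One plausible rationale for 10 specifically: summing N values that each fit in precision p needs at most p + ceil(log10(N)) digits, so 10 extra digits leave headroom for on the order of 10^10 rows without overflow. The linked source does not state this explicitly, though.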


