[ https://issues.apache.org/jira/browse/SPARK-40351?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17600535#comment-17600535 ]

Yuming Wang commented on SPARK-40351:
-------------------------------------

https://github.com/apache/spark/blob/v3.3.0/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/Sum.scala#L52-L53

Why do you want to override this value?
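
For context, the linked lines resolve the result type of Sum for decimal input. Roughly excerpted from Sum.scala in Spark 3.3.0 (abbreviated here; see the link above for the exact source):

    private lazy val resultType = child.dataType match {
      case DecimalType.Fixed(precision, scale) =>
        // The result precision is widened by a hard-coded 10 digits;
        // bounded() caps it at DecimalType.MAX_PRECISION (38).
        DecimalType.bounded(precision + 10, scale)
      case _: IntegralType => LongType
      case _ => child.dataType
    }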

> Spark Sum increases the precision of DecimalType arguments by 10
> ----------------------------------------------------------------
>
>                 Key: SPARK-40351
>                 URL: https://issues.apache.org/jira/browse/SPARK-40351
>             Project: Spark
>          Issue Type: Question
>          Components: Optimizer
>    Affects Versions: 3.2.0
>            Reporter: Tymofii
>            Priority: Minor
>
> Currently, Spark automatically increases the precision of a Decimal field by 
> 10 (a hard-coded value) after a SUM aggregate operation - 
> [https://github.com/apache/spark/blob/branch-3.2/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala#L1877]
> There are a couple of questions:
>  # Why was 10 chosen as the default?
>  # Does it make sense to allow the user to override this value via 
> configuration?
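
A minimal spark-shell sketch of the behavior described above (the column name and values are made up for illustration):

    import org.apache.spark.sql.functions.sum
    import spark.implicits._

    // Start from a decimal(10,2) column...
    val df = Seq(BigDecimal("1.23"), BigDecimal("4.56"))
      .toDF("amount")
      .selectExpr("CAST(amount AS DECIMAL(10,2)) AS amount")

    // ...SUM widens the precision by 10 while keeping the scale:
    df.select(sum($"amount")).printSchema()
    // root
    //  |-- sum(amount): decimal(20,2) (nullable = true)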


