[ https://issues.apache.org/jira/browse/SPARK-26664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-26664:
----------------------------
    Target Version/s: 3.0.0

> Make DecimalType's minimum adjusted scale configurable
> ------------------------------------------------------
>
>                 Key: SPARK-26664
>                 URL: https://issues.apache.org/jira/browse/SPARK-26664
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Kris Mok
>            Priority: Minor
>
> Introduce a new conf flag that allows the user to set the value of
> {{DecimalType.MINIMUM_ADJUSTED_SCALE}}, currently a constant of 6, to match
> their workloads' needs.
> The new flag will be {{spark.sql.decimalOperations.minimumAdjustedScale}}.
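> As a sketch of the intended usage (the flag does not exist yet; the name
> below is the one proposed in this ticket, and the example value of 12 is
> arbitrary):
> {code:scala}
> import org.apache.spark.sql.SparkSession
>
> // Hypothetical usage of the proposed flag; today only the hard-coded
> // constant of 6 applies.
> val spark = SparkSession.builder()
>   .master("local[*]")
>   .config("spark.sql.decimalOperations.allowPrecisionLoss", "true")
>   .config("spark.sql.decimalOperations.minimumAdjustedScale", "12") // proposed
>   .getOrCreate()
>
> // With the hard-coded minimum of 6 this yields decimal(38,6); with a
> // configurable minimum of 12 it would yield decimal(38,12) instead.
> spark.sql("SELECT CAST(1 AS DECIMAL(38,18)) / CAST(3 AS DECIMAL(38,18))").printSchema()
> {code}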
> SPARK-22036 introduced a new conf flag
> {{spark.sql.decimalOperations.allowPrecisionLoss}} to match the behavior of
> SQL Server and newer versions of Hive, which allow precision loss when
> multiplying or dividing large and small decimal numbers.
> As part of that change, a fixed {{MINIMUM_ADJUSTED_SCALE}} of 6 was
> introduced for the case where precision loss is allowed.
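> For reference, the adjustment in {{DecimalType.adjustPrecisionScale}} is
> roughly the following (a paraphrase of the common, non-negative-scale case;
> {{MAX_PRECISION}} is 38):
> {code:scala}
> val MAX_PRECISION = 38
> val MINIMUM_ADJUSTED_SCALE = 6 // the constant this ticket would make configurable
>
> def adjustPrecisionScale(precision: Int, scale: Int): (Int, Int) = {
>   if (precision <= MAX_PRECISION) {
>     (precision, scale) // already representable, nothing to adjust
>   } else {
>     // Keep all integral digits and shrink the scale, but never below
>     // MINIMUM_ADJUSTED_SCALE (capped at the requested scale itself).
>     val intDigits = precision - scale
>     val minScale = math.min(scale, MINIMUM_ADJUSTED_SCALE)
>     val adjustedScale = math.max(MAX_PRECISION - intDigits, minScale)
>     (MAX_PRECISION, adjustedScale)
>   }
> }
> {code}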
> Some customer workloads may need a larger adjusted scale to match their
> business needs, and in exchange they may be willing to tolerate more
> calculations overflowing the max precision and producing nulls. They would
> therefore like the minimum adjusted scale to be configurable, hence the
> need for a new conf.
> Introducing this conf flag does not change the default behavior.


