
Gengliang Wang updated SPARK-40389:
-----------------------------------
    Affects Version/s: 3.3.1

> Decimals can't upcast as integral types if the cast can overflow
> ----------------------------------------------------------------
>
>                 Key: SPARK-40389
>                 URL: https://issues.apache.org/jira/browse/SPARK-40389
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.4.0, 3.3.1
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
>             Fix For: 3.4.0, 3.3.1
>
>
> In Spark SQL, the method "canUpCast" returns true iff we can safely up-cast
> the `from` type to the `to` type without any truncation, precision loss, or
> possible runtime failures.
> Meanwhile, DecimalType(10, 0) is considered "canUpCast" to the Integer type.
> This is wrong, since casting 9000000000BD to the Integer type will overflow.
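>
> A minimal spark-shell sketch (not part of the original report) of the buggy
> check; Cast.canUpCast is the Catalyst helper described above:
> {code:scala}
> import org.apache.spark.sql.catalyst.expressions.Cast
> import org.apache.spark.sql.types.{DecimalType, IntegerType}
>
> // Decimal(10, 0) can hold values up to 9999999999, which does not fit in
> // a 32-bit integer, yet the up-cast is reported as always safe.
> Cast.canUpCast(DecimalType(10, 0), IntegerType)  // true (buggy)
> {code}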
> As a result:
>  * The optimizer rule SimplifyCasts relies on the method "canUpCast" and
> will mistakenly simplify "cast(cast(9000000000BD as int) as long)" to
> "cast(9000000000BD as long)"
>  * The STRICT store assignment policy relies on this method too. With the
> policy enabled, inserting 9000000000BD into an integer column passes the
> compile-time check and inserts the unexpected value 410065408 (see the
> repro sketch after this list).
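>
> A repro sketch for both symptoms, again not from the original report; it
> assumes ANSI mode is off, and the table name "t" is chosen here purely for
> illustration:
> {code:scala}
> spark.conf.set("spark.sql.ansi.enabled", "false")
>
> // SimplifyCasts rewrites cast(cast(9000000000BD as int) as long) to
> // cast(9000000000BD as long), so the inner overflow is silently skipped:
> spark.sql("SELECT CAST(CAST(9000000000BD AS INT) AS LONG)").show()
> // overflow-then-widen would give 410065408; the rewrite returns 9000000000
>
> // STRICT store assignment trusts canUpCast, so this insert passes the
> // compile-time check and stores the overflowed value:
> spark.conf.set("spark.sql.storeAssignmentPolicy", "STRICT")
> spark.sql("CREATE TABLE t (i INT) USING parquet")
> spark.sql("INSERT INTO t VALUES (9000000000BD)")
> spark.sql("SELECT i FROM t").show()  // 410065408
> {code}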


