[ https://issues.apache.org/jira/browse/SPARK-40389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17601684#comment-17601684 ]
Apache Spark commented on SPARK-40389:
--------------------------------------

User 'gengliangwang' has created a pull request for this issue:
https://github.com/apache/spark/pull/37832

> Decimals can't upcast as integral types if the cast can overflow
> ----------------------------------------------------------------
>
>                 Key: SPARK-40389
>                 URL: https://issues.apache.org/jira/browse/SPARK-40389
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
>
> In Spark SQL, the method "canUpCast" returns true iff we can safely up-cast
> the `from` type to the `to` type without any truncation, precision loss, or
> possible runtime failure.
> Meanwhile, DecimalType(10, 0) is considered "canUpCast" to the Integer type.
> This is wrong, since casting 9000000000BD to the Integer type will overflow.
> As a result:
> * The optimizer rule SimplifyCasts relies on the method "canUpCast", so it
>   will mistakenly simplify "cast(cast(9000000000BD as int) as long)" to
>   "cast(9000000000BD as long)".
> * The STRICT store assignment policy relies on this method too. With the
>   policy enabled, inserting 9000000000BD into an integer column passes the
>   compile-time check and inserts an unexpected value, 410065408.
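For context, here is a minimal sketch of how the SimplifyCasts half of the
bug can be observed, assuming a local SparkSession on an affected 3.x build
with spark.sql.ansi.enabled=false (the default); the object and app names
below are made up for illustration:

    import org.apache.spark.sql.SparkSession

    object Spark40389Repro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[1]")
          .appName("SPARK-40389-repro")
          .getOrCreate()

        // 9000000000 does not fit in a 32-bit int, so the inner cast
        // overflows (wrapping to 410065408 with ANSI mode off), and the
        // two queries below should therefore disagree.
        spark.sql("SELECT CAST(CAST(9000000000BD AS INT) AS LONG)").show()

        // On an affected build, SimplifyCasts (trusting canUpCast) collapses
        // the double cast above into this single cast, so both queries
        // return 9000000000 instead of the overflowed value.
        spark.sql("SELECT CAST(9000000000BD AS LONG)").show()

        spark.stop()
      }
    }

On a build with the fix applied, the first query should print the wrapped
value 410065408 (or raise an overflow error with ANSI mode on) rather than
9000000000, since the double cast is no longer simplified away.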