[ https://issues.apache.org/jira/browse/SPARK-39865?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Gengliang Wang updated SPARK-39865:
-----------------------------------
Parent: SPARK-40182
Issue Type: Sub-task (was: Bug)
> Show proper error messages on the overflow errors of table insert
> -----------------------------------------------------------------
>
> Key: SPARK-39865
> URL: https://issues.apache.org/jira/browse/SPARK-39865
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.3.0, 3.4.0
> Reporter: Gengliang Wang
> Assignee: Gengliang Wang
> Priority: Major
> Fix For: 3.3.1
>
>
> In Spark 3.3, the error message of ANSI CAST is improved. However, table
> insertion reuses the same CAST expression, so it reports a generic cast error:
> {code:java}
> > create table tiny(i tinyint);
> > insert into tiny values (1000);
> org.apache.spark.SparkArithmeticException[CAST_OVERFLOW]: The value 1000 of
> the type "INT" cannot be cast to "TINYINT" due to an overflow. Use `try_cast`
> to tolerate overflow and return NULL instead. If necessary set
> "spark.sql.ansi.enabled" to "false" to bypass this error.
> {code}
>
> Showing the hint `If necessary set "spark.sql.ansi.enabled" to "false" to
> bypass this error` doesn't help at all in this case. This issue is to fix the
> error message. After the changes, the error message for this example becomes:
> {code:java}
> org.apache.spark.SparkArithmeticException: [CAST_OVERFLOW_IN_TABLE_INSERT]
> Fail to insert a value of "INT" type into the "TINYINT" type column `i` due
> to an overflow. Use `try_cast` on the input value to tolerate overflow and
> return NULL instead.{code}
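>
> For reference, a minimal sketch of the workaround suggested by the new message,
> assuming the `tiny` table from the example above: apply `try_cast` on the input
> value so the out-of-range value is stored as NULL instead of failing the insert.
> {code:java}
> -- hedged sketch: cast the input with try_cast so the overflowing value
> -- becomes NULL rather than raising CAST_OVERFLOW_IN_TABLE_INSERT
> > insert into tiny select try_cast(1000 as tinyint);
> {code}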