[ 
https://issues.apache.org/jira/browse/SPARK-40209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Max Gekk updated SPARK-40209:
-----------------------------
    Description: 
The example below demonstrates the issue:
{code:sql}
spark-sql> select cast(interval '10.123' second as decimal(1, 0));
[NUMERIC_VALUE_OUT_OF_RANGE] 0.000010 cannot be represented as Decimal(1, 0). 
If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
{code}

The value 0.000010 reported in the error message is unrelated to the input 10.123; the message should reference the value that actually failed to fit into DECIMAL(1, 0).
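
For reference, 0.000010 is exactly what Spark's Decimal renders when an unscaled value of 10 is paired with scale 6, the microsecond scale that intervals are stored with (10.123 seconds = 10123000 microseconds). That suggests, though this is only a guess at the cause and not a confirmed diagnosis, that the error message formats the overflowing value with the wrong scale. A minimal spark-shell sketch of that rendering mismatch:
{code:scala}
import org.apache.spark.sql.types.Decimal

// 10.123 seconds is stored internally as 10123000 microseconds.
val micros = 10123000L

// Hypothetical illustration only: pairing the bare value 10 with the
// interval's microsecond scale (6) instead of the target scale (0)
// reproduces exactly the number quoted in the error message.
val misScaled = Decimal(10L, 12, 6)   // unscaled = 10, precision = 12, scale = 6
println(misScaled)                    // 0.000010

// The value a user would expect the message to mention:
val actual = Decimal(micros, 12, 6)   // 10123000 micros at scale 6
println(actual)                       // 10.123000
{code}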

  was:
The example below demonstrates the issue:
{code:sql}
spark-sql> select cast(interval '10.123' second as decimal(1, 0));
[NUMERIC_VALUE_OUT_OF_RANGE] 0.000010 cannot be represented as Decimal(1, 0). 
If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
{code}



> Incorrect value in the error message of NUMERIC_VALUE_OUT_OF_RANGE
> ------------------------------------------------------------------
>
>                 Key: SPARK-40209
>                 URL: https://issues.apache.org/jira/browse/SPARK-40209
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Max Gekk
>            Assignee: Max Gekk
>            Priority: Major
>
> The example below demonstrates the issue:
> {code:sql}
> spark-sql> select cast(interval '10.123' second as decimal(1, 0));
> [NUMERIC_VALUE_OUT_OF_RANGE] 0.000010 cannot be represented as Decimal(1, 0). 
> If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
> {code}
> The value 0.000010 reported in the error message is unrelated to the input 10.123; the message should reference the value that actually failed to fit into DECIMAL(1, 0).


