[
https://issues.apache.org/jira/browse/SPARK-20211?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15955285#comment-15955285
]
StanZhai commented on SPARK-20211:
----------------------------------
A workaround is difficult for me: all of my SQL is generated by a
higher-level system, so I cannot cast every column to double.
FLOOR and CEIL are frequently used functions, and not all users will
report this problem to the community when they encounter it.
We should pay attention to the correctness of SQL.
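For context, the workaround under discussion amounts to rewriting the generated SQL so the comparison happens in floating point rather than decimal. A minimal sketch, assuming the table `tb` from the report (the explicit casts are the assumed workaround, not part of the original query):
{code}
-- Fails in affected Spark versions: both operands are promoted to
-- DecimalType, and the derived result type ends up with an invalid
-- precision/scale combination.
select 1 > 0.0001 from tb;

-- Workaround sketch: cast the operands to double so the comparison
-- is evaluated in floating point and no decimal promotion occurs.
select cast(1 as double) > cast(0.0001 as double) from tb;
{code}
For SQL generated by another system, this rewrite would have to be applied to every numeric comparison, which is why it is impractical here.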
> `1 > 0.0001` throws Decimal scale (0) cannot be greater than precision (-2)
> exception
> -------------------------------------------------------------------------------------
>
> Key: SPARK-20211
> URL: https://issues.apache.org/jira/browse/SPARK-20211
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0, 2.0.1, 2.0.2, 2.1.0, 2.1.1
> Reporter: StanZhai
> Labels: correctness
>
> The following SQL:
> {code}
> select 1 > 0.0001 from tb
> {code}
> throws a "Decimal scale (0) cannot be greater than precision (-2)"
> exception in Spark 2.x.
> `floor(0.0001)` and `ceil(0.0001)` have the same problem in Spark 1.6.x and
> Spark 2.x.
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)