[
https://issues.apache.org/jira/browse/FLINK-36378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17885610#comment-17885610
]
Jing Ge commented on FLINK-36378:
---------------------------------
AFAIK, the precision must be greater than or equal to the scale (the scale must lie between 0 and the precision, both inclusive), according to
https://github.com/apache/flink/blob/6633719c728526a08344645e0a844f27d25629f4/flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/logical/DecimalType.java#L74
> type extraction problem between java BigDecimal and sql decimal
> ---------------------------------------------------------------
>
> Key: FLINK-36378
> URL: https://issues.apache.org/jira/browse/FLINK-36378
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Planner
> Affects Versions: 2.0-preview
> Reporter: Jacky Lau
> Assignee: Jacky Lau
> Priority: Major
> Labels: pull-request-available
> Fix For: 2.0-preview
>
>
> Adding the following case to ValueDataTypeConverterTest
> of(new BigDecimal("0.000"), DataTypes.DECIMAL(4, 3))
> fails with:
> org.apache.flink.table.api.ValidationException:
> Decimal scale must be between 0 and the precision 1 (both inclusive).
>
> For comparison:
> Spark: 0.000 -> decimal(3, 3)
> Calcite: 0.000 -> decimal(4, 3)
> We should follow Calcite's behavior.
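For illustration, the mismatch can be reproduced with plain java.math.BigDecimal: the unscaled value of "0.000" is 0, so BigDecimal reports precision 1 while the scale is 3, and naive extraction produces the invalid DECIMAL(1, 3). The precision adjustment below is only a sketch of one way to reach Calcite's DECIMAL(4, 3); it is not the actual logic in ValueDataTypeConverter.

```java
import java.math.BigDecimal;

public class DecimalPrecisionDemo {
    public static void main(String[] args) {
        BigDecimal zero = new BigDecimal("0.000");

        // BigDecimal counts the digits of the unscaled value (0 -> one digit),
        // so precision() is 1 even though scale() is 3. Mapping these values
        // directly to SQL gives DECIMAL(1, 3), which violates scale <= precision.
        System.out.println("precision = " + zero.precision()); // 1
        System.out.println("scale     = " + zero.scale());     // 3

        // Hypothetical adjustment: widen the precision so it can hold the scale,
        // yielding DECIMAL(4, 3) as Calcite does for 0.000.
        int precision = Math.max(zero.precision(), zero.scale() + 1);
        System.out.println("DECIMAL(" + precision + ", " + zero.scale() + ")");
    }
}
```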
--
This message was sent by Atlassian Jira
(v8.20.10#820010)