[
https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Timo Walther resolved FLINK-8215.
---------------------------------
Resolution: Fixed
Fix Version/s: 1.5.0
Fixed in 1.5: 2142eeda9df262e989951c4b31273cbd9346567f
> Support implicit type widening for array/map constructors in SQL
> ----------------------------------------------------------------
>
> Key: FLINK-8215
> URL: https://issues.apache.org/jira/browse/FLINK-8215
> Project: Flink
> Issue Type: Bug
> Components: Table API & SQL
> Reporter: Rong Rong
> Assignee: Rong Rong
> Fix For: 1.5.0
>
>
> The Table API goes through `LogicalNode.validate()`, which runs the collection
> validation and rejects inconsistent element types: it throws a
> `ValidationException` for expressions like `array(1.0, 2.0f)`.
> The SQL API uses `FlinkPlannerImpl.validator(SqlNode)`, which relies on Calcite's
> SqlNode validation and supports resolving the least restrictive type; however,
> `ARRAY[CAST(1 AS DOUBLE), CAST(2 AS FLOAT)]` then throws a codegen exception.
> The root cause is that the code generation for these collection value constructors
> does not cast to or resolve the least restrictive type correctly. I see 2 options:
> 1. Strengthen the validation so that SQL does not resolve the least restrictive
> type either.
> 2. Make codegen support the least-restrictive-type cast, e.g. by using
> `generateCast` instead of a direct cast like `(ClassType) element`.
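The failure mode behind option 2 can be reproduced in plain Java, outside Flink's codegen: a direct reference cast on a boxed element throws at runtime when the element's boxed type differs from the target, while an explicit numeric conversion (what a generated cast would emit) widens correctly. A minimal sketch, not Flink's actual generated code:

```java
public class CastDemo {
    public static void main(String[] args) {
        Object element = 2.0f; // a FLOAT element, boxed as java.lang.Float

        // Direct cast, analogous to `(ClassType) element` in the generated code:
        // fails at runtime because Float is not a subtype of Double.
        try {
            Double d = (Double) element;
            System.out.println(d);
        } catch (ClassCastException e) {
            System.out.println("direct cast failed: " + e);
        }

        // Explicit numeric conversion, analogous to what a generated cast
        // (e.g. via `generateCast`) would emit: widens FLOAT -> DOUBLE.
        double widened = ((Number) element).doubleValue();
        System.out.println(widened); // prints 2.0
    }
}
```

This is why resolving the least restrictive type during validation is not enough on its own: the generated element accesses must also perform a real numeric conversion rather than a reference cast.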
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)