[ https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16282799#comment-16282799 ]

Rong Rong commented on FLINK-8215:
----------------------------------

Agreed. As long as the ValidationException is thrown for the Table API, we can 
go ahead and make the codegen support type widening by adding proper type 
casts. Will go with option #2 then.
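
For reference, a minimal sketch reproducing the two behaviors. This assumes the 
1.4-era Scala Table API (method names such as `tEnv.sql` and the environment 
setup may differ slightly across versions):

    import org.apache.flink.api.scala._
    import org.apache.flink.table.api.TableEnvironment
    import org.apache.flink.table.api.scala._

    val env  = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)
    val t    = env.fromElements((1, "a")).toTable(tEnv, 'a, 'b)
    tEnv.registerTable("T", t)

    // Table API: LogicalNode.validate() rejects the mixed element types
    // up front with a ValidationException.
    t.select(array(1.0, 2.0f))

    // SQL API: Calcite's validator widens to the least restrictive type
    // (DOUBLE), so validation passes, but code generation then fails.
    tEnv.sql("SELECT ARRAY[CAST(1 AS DOUBLE), CAST(2 AS FLOAT)] FROM T")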

> Collections codegen exception when constructing Array or Map via SQL API
> ------------------------------------------------------------------------
>
>                 Key: FLINK-8215
>                 URL: https://issues.apache.org/jira/browse/FLINK-8215
>             Project: Flink
>          Issue Type: Bug
>          Components: Table API & SQL
>            Reporter: Rong Rong
>            Assignee: Rong Rong
>
> The Table API goes through `LogicalNode.validate()`, which performs the 
> collection validation and rejects inconsistent element types; this throws a 
> `ValidationException` for something like `array(1.0, 2.0f)`.
> The SQL API uses `FlinkPlannerImpl.validator(SqlNode)`, which relies on 
> Calcite's SqlNode validation and supports resolving the least restrictive 
> type, so `ARRAY[CAST(1 AS DOUBLE), CAST(2 AS FLOAT)]` passes validation but 
> then throws a codegen exception.
> The root cause is that the code generation for these collection value 
> constructors does not cast to or resolve the least restrictive type 
> correctly. I see 2 options:
> 1. Strengthen validation so the SQL side does not resolve the least 
> restrictive type either.
> 2. Make the codegen support the least restrictive type cast, e.g. by using 
> `generateCast` instead of a direct cast like `(ClassType) element` (see the 
> sketch below).
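
To make option #2 concrete, a rough sketch of the intended change in the 
collection constructor codegen. Only `generateCast` and `GeneratedExpression` 
are referenced from the issue; `elementExprs`, `resolvedComponentType`, and 
`nullCheck` are assumed surrounding context, and the `generateCast` signature 
is only an approximation of the actual internals:

    // Sketch only: widen each element expression to the resolved least
    // restrictive component type before emitting the array construction,
    // instead of emitting a direct Java cast like "(Double) element".
    val castedElements: Seq[GeneratedExpression] = elementExprs.map { e =>
      if (e.resultType == resolvedComponentType) {
        e // already the target type, no cast needed
      } else {
        // reuse the scalar cast codegen rather than "(ClassType) element"
        generateCast(nullCheck, e, resolvedComponentType)
      }
    }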


