[jira] [Commented] (FLINK-8215) Collections codegen exception when constructing Array or Map via SQL API

2017-12-19 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16296576#comment-16296576 ]

ASF GitHub Bot commented on FLINK-8215:
---

Github user twalthr commented on the issue:

https://github.com/apache/flink/pull/5148
  
Thanks for this fix @walterddr. The code looks good. I will merge this...


> Collections codegen exception when constructing Array or Map via SQL API
> 
>
> Key: FLINK-8215
> URL: https://issues.apache.org/jira/browse/FLINK-8215
> Project: Flink
>  Issue Type: Bug
>  Components: Table API & SQL
>Reporter: Rong Rong
>Assignee: Rong Rong
>
> TableAPI goes through `LogicalNode.validate()`, which triggers the 
> collection validation and rejects inconsistent types; this throws a 
> `ValidationException` for something like `array(1.0, 2.0f)`.
> SqlAPI uses `FlinkPlannerImpl.validator(SqlNode)`, which relies on Calcite's SqlNode 
> validation and supports resolving the leastRestrictive type. `ARRAY[CAST(1 AS 
> DOUBLE), CAST(2 AS FLOAT)]` passes validation but throws a codegen exception.
> The root cause is that the code generation for these collection value constructors 
> does not cast to or resolve the leastRestrictive type correctly. I see 2 options:
> 1. Strengthen validation so that SQL does not resolve the leastRestrictive type.
> 2. Make codegen support the leastRestrictive type cast, e.g. by using 
> `generateCast` instead of a direct cast like `(ClassType) element`.
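The failure mode of the direct cast can be reproduced outside Flink. Below is a minimal Java sketch (not Flink code; the variable names are illustrative) of why casting a boxed `Float` to `Double` fails at runtime, while widening the unboxed primitive works:

```java
public class CastDemo {
    public static void main(String[] args) {
        Object element = 1.0f; // autoboxed to java.lang.Float

        // A direct reference cast of the boxed value, as in `(ClassType) element`, fails:
        boolean directCastFailed = false;
        try {
            Double d = (Double) element; // java.lang.Float cannot be cast to java.lang.Double
        } catch (ClassCastException e) {
            directCastFailed = true;
        }

        // Widening the unboxed primitive (the effect of a proper generateCast) is safe:
        double widened = (double) (float) (Float) element;

        System.out.println("direct cast failed: " + directCastFailed);
        System.out.println("widened: " + widened);
    }
}
```

This is exactly why the boxed-value cast cannot substitute for a numeric widening conversion: reference casts never convert between distinct wrapper classes.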



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (FLINK-8215) Collections codegen exception when constructing Array or Map via SQL API

2017-12-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16286763#comment-16286763 ]

ASF GitHub Bot commented on FLINK-8215:
---

GitHub user walterddr opened a pull request:

https://github.com/apache/flink/pull/5148

[FLINK-8215][Table] fix codegen issue on array/map value constructor 
dealing with implicit type cast

## What is the purpose of the change

This pull request fixes a codegen issue with collection value constructors on the SQL API.
On the Table API we enforce strict type compatibility and no implicit type cast 
is allowed. On the SQL API, however, Calcite's `SqlMultisetValueConstructor` accepts 
combinations of types for which the `leastRestrictive` type is resolved during 
validation, but the code generation did not handle the resulting cast.

A type-cast-safe method is introduced to avoid such codegen exceptions.


## Brief change log

- Added in `generateCast` before boxing the nullable item
- Added in unit-test for Map / Array value constructor


## Verifying this change

- Unit tests are added.

## Does this pull request potentially affect one of the following parts:

No

## Documentation

No documentation changes required.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/walterddr/flink FLINK-8215

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5148.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5148


commit f8eb55d80fc5232f7989e3e48666257bd4383042
Author: Rong Rong 
Date:   2017-12-11T22:55:49Z

Before generating the output type boxing, try a generateCast on the expression 
first. This avoids an invalid direct type cast such as: 
(java.lang.Double) 1.0f.
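The cast-then-box ordering the commit describes can be sketched as follows. This is a hypothetical hand-written analogue, not actual Flink codegen output; the class and method names are invented for illustration:

```java
public class GeneratedArrayConstructor {
    // Constructs ARRAY[CAST(1 AS DOUBLE), CAST(2 AS FLOAT)] where DOUBLE is
    // the leastRestrictive element type resolved by the validator.
    public static Double[] construct(double a, float b) {
        Double[] result = new Double[2];
        // Broken pattern: box the value first, then cast the boxed reference:
        //   result[1] = (Double) (Object) b;  // ClassCastException at runtime
        // Fixed pattern (generateCast): widen the primitive to the target type
        // first, then let boxing produce the correct wrapper:
        result[0] = (double) a; // already a double, boxes to java.lang.Double
        result[1] = (double) b; // float widened to double before boxing
        return result;
    }

    public static void main(String[] args) {
        Double[] arr = construct(1.0, 2.0f);
        System.out.println(arr[0] + "," + arr[1]);
    }
}
```

The key point is that the widening happens on the primitive, so the subsequent boxing always produces the wrapper class the array expects.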






[jira] [Commented] (FLINK-8215) Collections codegen exception when constructing Array or Map via SQL API

2017-12-07 Thread Rong Rong (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16282799#comment-16282799 ]

Rong Rong commented on FLINK-8215:
--

Agreed. As long as the ValidationException is thrown for the Table API, we can go 
ahead and make the codegen support type widening by adding a better type 
cast. Will go with option #2 then.



[jira] [Commented] (FLINK-8215) Collections codegen exception when constructing Array or Map via SQL API

2017-12-07 Thread Timo Walther (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16281537#comment-16281537 ]

Timo Walther commented on FLINK-8215:
-

I think the Table API should not support automatic type widening. It is also 
not supported at other locations. But if the SQL standard supports it, we need 
to adapt the code generation accordingly.
