[ https://issues.apache.org/jira/browse/FLINK-32464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yunfeng Zhou updated FLINK-32464:
---------------------------------
    Description: 
In an attempt to convert a table between the Table API and the SQL API using the 
following program:


{code:java}
import static org.apache.flink.table.api.Expressions.$;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        Table table = tEnv.fromValues(1, 2, 3);

        tEnv.createTemporaryView("input_table", table);
        table = tEnv.sqlQuery("SELECT MAP[f0, 1] AS f1 from input_table");

        // Cast the MAP column produced by SQL back through the Table API.
        table = table.select($("f1").cast(DataTypes.MAP(DataTypes.INT(), DataTypes.INT())));

        tEnv.createTemporaryView("input_table_2", table);
        tEnv.sqlQuery("SELECT * from input_table_2");  // throws the AssertionError below
    }
{code}

The following exception is thrown.


{code}
Exception in thread "main" java.lang.AssertionError: Conversion to relational algebra failed to preserve datatypes:
validated type:
RecordType((INTEGER, INTEGER) MAP NOT NULL f1-MAP<INT, INT>) NOT NULL
converted type:
RecordType((INTEGER, INTEGER) MAP f1-MAP<INT, INT>) NOT NULL
rel:
LogicalProject(f1-MAP<INT, INT>=[CAST(MAP($0, 1)):(INTEGER, INTEGER) MAP])
  LogicalValues(tuples=[[{ 1 }, { 2 }, { 3 }]])

        at org.apache.calcite.sql2rel.SqlToRelConverter.checkConvertedType(SqlToRelConverter.java:470)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:582)
        at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:215)
        at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:191)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:1498)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:1253)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertValidatedSqlNode(SqlToOperationConverter.java:374)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:262)
        at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:106)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:703)
        at org.apache.flink.streaming.connectors.redis.RedisSinkITCase.main
{code}

The validated and converted row types above differ only in the nullability of the MAP field (NOT NULL vs. nullable), so it seems that there is a bug in the Table-SQL conversion and selection process when a type cast is involved: the NOT NULL trait of the CAST result is not preserved.
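
Since the mismatch is limited to nullability, a possible workaround, shown below as an untested sketch rather than a verified fix, is to make the Table API cast target explicitly non-nullable via DataType#notNull() so that it matches the NOT NULL type SQL infers for MAP[f0, 1]. The class name is only illustrative.

{code:java}
// Untested workaround sketch (assumption, not a verified fix): cast to a
// NOT NULL map type so the Table API cast keeps the NOT NULL trait that
// SQL infers for MAP[f0, 1].
import static org.apache.flink.table.api.Expressions.$;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class MapCastNullabilityWorkaround {  // illustrative class name
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        Table table = tEnv.fromValues(1, 2, 3);
        tEnv.createTemporaryView("input_table", table);
        table = tEnv.sqlQuery("SELECT MAP[f0, 1] AS f1 from input_table");

        // Cast to a non-nullable MAP type instead of the default nullable one.
        table = table.select(
                $("f1").cast(DataTypes.MAP(DataTypes.INT(), DataTypes.INT()).notNull()));

        tEnv.createTemporaryView("input_table_2", table);
        tEnv.sqlQuery("SELECT * from input_table_2");
    }
}
{code}

If this works, it would also point to the nullability handling of CAST during SQL-to-rel conversion as the likely root cause.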


> AssertionError when converting between Table and SQL with selection and type cast
> ---------------------------------------------------------------------------------
>
>                 Key: FLINK-32464
>                 URL: https://issues.apache.org/jira/browse/FLINK-32464
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.16.1
>            Reporter: Yunfeng Zhou
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
