[ 
https://issues.apache.org/jira/browse/FLINK-6149?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15974354#comment-15974354
 ] 

ASF GitHub Bot commented on FLINK-6149:
---------------------------------------

Github user DmytroShkvyra commented on the issue:

    https://github.com/apache/flink/pull/3594
  
    @KurtYoung, @fhueske  This PR breaks handling of null-generating nodes:
    `
    org.apache.flink.table.api.TableException: Cannot generate a valid execution plan for the given query:
    
    LogicalProject(a=[$1], cnt=[$0])
      LogicalJoin(condition=[<($1, $0)], joinType=[right])
        LogicalProject(cnt=[$0])
          LogicalFilter(condition=[<($0, 0)])
            LogicalAggregate(group=[{}], cnt=[COUNT()])
              LogicalProject($f0=[0])
                LogicalTableScan(table=[[B]])
        LogicalTableScan(table=[[A]])
    
    This exception indicates that the query uses an unsupported SQL feature.
    Please check the documentation for the set of currently supported SQL features.
    
        at org.apache.flink.table.api.TableEnvironment.runVolcanoPlanner(TableEnvironment.scala:267)
        at org.apache.flink.table.api.BatchTableEnvironment.optimize(BatchTableEnvironment.scala:235)
        at org.apache.flink.table.api.BatchTableEnvironment.translate(BatchTableEnvironment.scala:265)
        at org.apache.flink.table.api.scala.BatchTableEnvironment.toDataSet(BatchTableEnvironment.scala:140)
        at org.apache.flink.table.api.scala.TableConversions.toDataSet(TableConversions.scala:40)`
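
    For reference, a minimal sketch of the kind of query that yields the plan above. Only the table names A and B and the join shape are taken from the plan; the Scala setup, column names beyond `a`/`cnt`, and the sample data are assumptions, not the original test:

    ```scala
    import org.apache.flink.api.scala._
    import org.apache.flink.table.api.TableEnvironment
    import org.apache.flink.table.api.scala._
    import org.apache.flink.types.Row

    object NullNodeRepro {
      def main(args: Array[String]): Unit = {
        val env = ExecutionEnvironment.getExecutionEnvironment
        val tEnv = TableEnvironment.getTableEnvironment(env)

        // Hypothetical inputs, named after the scans in the plan above.
        tEnv.registerTable("A", env.fromElements((1, "foo"), (2, "bar")).toTable(tEnv, 'a, 'b))
        tEnv.registerTable("B", env.fromElements((1, "baz")).toTable(tEnv, 'c, 'd))

        // A scalar subquery whose filter (cnt < 0) removes every row, right-joined to A,
        // so cnt has to become NULL for each row of A -- the null-generating side.
        val query =
          "SELECT a, cnt FROM " +
            "(SELECT COUNT(*) AS cnt FROM B HAVING COUNT(*) < 0) AS t " +
            "RIGHT JOIN A ON a < cnt"

        // Optimization runs when the Table is converted; the failure surfaces in
        // TableEnvironment.runVolcanoPlanner as shown in the trace above.
        tEnv.sql(query).toDataSet[Row].print()
      }
    }
    ```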


> add additional flink logical relation nodes
> -------------------------------------------
>
>                 Key: FLINK-6149
>                 URL: https://issues.apache.org/jira/browse/FLINK-6149
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>            Reporter: Kurt Young
>            Assignee: Kurt Young
>             Fix For: 1.3.0
>
>



