[ https://issues.apache.org/jira/browse/FLINK-13279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16887780#comment-16887780 ]

Jingsong Lee commented on FLINK-13279:
--------------------------------------

At present, the executeQuery of the SQL Client is implemented as 
registerTableSink (into an in-memory sink) + insertInto + fetching the result 
from the InMemorySink class. The sink is registered through the 
*TableEnv.registerTableSink* interface.

But *TableEnv.registerTableSink* registers the sink in the builtin_catalog, not 
in the current_catalog. So insertInto cannot find the table in the 
current_catalog.
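
A minimal sketch of that collect path, assuming a pre-configured in-memory 
sink is passed in; the method and variable names are illustrative, not the 
exact SQL Client internals:

{code:java}
// Illustrative sketch only, not the real SQL Client code.
import java.util.UUID;

import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.sinks.TableSink;

public class CollectPathSketch {

    static void executeQuery(TableEnvironment tableEnv, String query, TableSink<?> inMemorySink) {
        String sinkName = "tmp_result_" + UUID.randomUUID().toString().replace("-", "_");

        // Step 1: the in-memory sink is registered via TableEnv.registerTableSink,
        // which puts it into the built-in catalog.
        tableEnv.registerTableSink(sinkName, inMemorySink);

        // Step 2: insertInto resolves sinkName against the *current* catalog
        // (e.g. myhive after "USE CATALOG myhive"), where the sink was never
        // registered, hence "No table was registered under the name ...".
        Table result = tableEnv.sqlQuery(query);
        result.insertInto(sinkName);

        // Step 3: the client later reads the rows back from the in-memory sink.
    }
}
{code}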

There are other similar issues:

https://issues.apache.org/jira/browse/FLINK-13150

https://issues.apache.org/jira/browse/FLINK-12771

Why is *TableEnv.registerTableSink* registered in the builtin_catalog? Because 
Flink TableSource and TableSink instances are not serializable, so they cannot 
be put into the HiveCatalog.
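
To make that constraint concrete, here is a simplified, assumed illustration 
(hypothetical stand-in classes, not the real Flink catalog implementations): a 
persistent catalog such as HiveCatalog can only store table metadata as string 
properties, while the built-in in-memory catalog can hold live Java objects 
such as a TableSink instance.

{code:java}
// Hypothetical stand-ins, not the real Flink catalog classes.
import java.util.HashMap;
import java.util.Map;

class PersistentCatalogSketch {
    // A persistent catalog (like HiveCatalog backed by the Hive metastore) can
    // only store table metadata as serializable string properties.
    void createTable(String name, Map<String, String> properties) {
        // write the properties to external, durable storage ...
    }
}

class InMemoryCatalogSketch {
    // The built-in in-memory catalog can hold arbitrary live Java objects, so a
    // non-serializable TableSink instance can only be registered here.
    private final Map<String, Object> tables = new HashMap<>();

    void createTable(String name, Object liveSinkInstance) {
        tables.put(name, liveSinkInstance);
    }
}
{code}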

IMO, this case is important because querying a table in a catalog is often the 
first step for users.

[~twalthr] [~dawidwys] What do you think about this?

> not able to query table registered in catalogs in SQL CLI
> ---------------------------------------------------------
>
>                 Key: FLINK-13279
>                 URL: https://issues.apache.org/jira/browse/FLINK-13279
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Client
>    Affects Versions: 1.9.0, 1.10.0
>            Reporter: Bowen Li
>            Assignee: Kurt Young
>            Priority: Blocker
>             Fix For: 1.9.0, 1.10.0
>
>
> When querying a simple table in catalog, SQL CLI reports 
> "org.apache.flink.table.api.TableException: No table was registered under the 
> name ArrayBuffer(default: select * from hivetable)."
> [~ykt836] can you please help triage this ticket to the proper person?
> Repro steps in SQL CLI (to set up dependencies of HiveCatalog, please refer 
> to dev/table/catalog.md):
> {code:java}
> Flink SQL> show catalogs;
> default_catalog
> myhive
> Flink SQL> use catalog myhive
> > ;
> Flink SQL> show databases;
> default
> Flink SQL> show tables;
> hivetable
> products
> test
> Flink SQL> describe hivetable;
> root
>  |-- name: STRING
>  |-- score: DOUBLE
> Flink SQL> select * from hivetable;
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.flink.table.api.TableException: No table was registered under the 
> name ArrayBuffer(default: select * from hivetable).
> {code}
> Exception in log:
> {code:java}
> 2019-07-15 14:59:12,273 WARN  org.apache.flink.table.client.cli.CliClient     
>               - Could not execute SQL statement.
> org.apache.flink.table.client.gateway.SqlExecutionException: Invalid SQL 
> query.
>       at 
> org.apache.flink.table.client.gateway.local.LocalExecutor.executeQueryInternal(LocalExecutor.java:485)
>       at 
> org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:317)
>       at 
> org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:469)
>       at 
> org.apache.flink.table.client.cli.CliClient.callCommand(CliClient.java:291)
>       at java.util.Optional.ifPresent(Optional.java:159)
>       at org.apache.flink.table.client.cli.CliClient.open(CliClient.java:200)
>       at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:123)
>       at org.apache.flink.table.client.SqlClient.start(SqlClient.java:105)
>       at org.apache.flink.table.client.SqlClient.main(SqlClient.java:194)
> Caused by: org.apache.flink.table.api.TableException: No table was registered 
> under the name ArrayBuffer(default: select * from hivetable).
>       at 
> org.apache.flink.table.api.internal.TableEnvImpl.insertInto(TableEnvImpl.scala:529)
>       at 
> org.apache.flink.table.api.internal.TableEnvImpl.insertInto(TableEnvImpl.scala:507)
>       at 
> org.apache.flink.table.api.internal.BatchTableEnvImpl.insertInto(BatchTableEnvImpl.scala:58)
>       at 
> org.apache.flink.table.api.internal.TableImpl.insertInto(TableImpl.java:428)
>       at 
> org.apache.flink.table.api.internal.TableImpl.insertInto(TableImpl.java:416)
>       at 
> org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeQueryInternal$10(LocalExecutor.java:476)
>       at 
> org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:202)
>       at 
> org.apache.flink.table.client.gateway.local.LocalExecutor.executeQueryInternal(LocalExecutor.java:474)
>       ... 8 more
> {code}
> However, {{select * from myhive.`default`.hivetable;}} seems to work well.
> Also note this is tested with changes in 
> https://github.com/apache/flink/pull/9049


