[ 
https://issues.apache.org/jira/browse/FLINK-24942?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17445721#comment-17445721
 ] 

Shengkai Fang commented on FLINK-24942:
---------------------------------------

The docs suggest using a catalog to manage tables in Hive, so I think this issue is not a problem in the SQL Client. Would you mind closing it?
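
For reference, registering a Hive catalog in the SQL Client looks roughly like the sketch below (the catalog name and the hive-conf-dir path are placeholders, not values from this report):

{code:sql}
-- Register a Hive catalog; 'myhive' and '/opt/hive-conf' are assumed names/paths
CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/opt/hive-conf'
);
-- Switch to it so Hive tables can be read and written directly
USE CATALOG myhive;
{code}

With the catalog registered, Hive tables are addressed through the catalog rather than via a {{'connector' = 'hive'}} table option, which is why no {{DynamicTableFactory}} for the identifier 'hive' is needed.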

> Could not find any factory for identifier 'hive' that implements 
> 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath
> ----------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-24942
>                 URL: https://issues.apache.org/jira/browse/FLINK-24942
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Client
>    Affects Versions: 1.14.0
>         Environment: Flink-1.14.0
>            Reporter: JasonLee
>            Priority: Major
>             Fix For: 1.15.0
>
>
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.flink.table.api.ValidationException: Could not find any factory 
> for identifier 'hive' that implements 
> 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
> Available factory identifiers are:
> blackhole
> datagen
> filesystem
> kafka
> print
> upsert-kafka
>  
> The above exception is thrown when I execute the following SQL, even though I 
> have added flink-sql-connector-hive-2.3.6_2.11-1.14.0.jar in flink/lib
> {code:sql}
> insert into fs_table
> select xxx, 
> xxx, 
> xxx, 
> xxx, 
> xxx, 
> DATE_FORMAT(ts_ltz, 'yyyy-MM-dd'), DATE_FORMAT(ts_ltz, 'HH')
> from kafka_table; {code}
>  



--
This message was sent by Atlassian Jira
(v8.20.1#820001)