The problem is solved:
1. Only flink-sql-connector-hive-2.3.6_2.11-1.12.0.jar is needed;
remove flink-connector-hive_2.11-1.12.0.jar and hive-exec-2.3.4.jar.
2. Restarting the SQL Client alone is not enough; the local cluster must be restarted as well.
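For anyone hitting the same ClassNotFoundException, the fix above can be sketched as the following shell steps (a minimal sketch: $FLINK_HOME and the source path of the connector jar are assumptions, not from the thread; the stop/start scripts assume a default standalone Flink 1.12 layout):

```shell
# Sketch only -- $FLINK_HOME and the jar's source path are assumptions.
cd "$FLINK_HOME/lib"

# Remove the jars that conflict with the bundled SQL connector.
rm flink-connector-hive_2.11-1.12.0.jar hive-exec-2.3.4.jar

# Use the single bundled Hive SQL connector instead.
cp /path/to/flink-sql-connector-hive-2.3.6_2.11-1.12.0.jar .

# Restarting the SQL Client alone is not enough: restart the local cluster too.
"$FLINK_HOME/bin/stop-cluster.sh"
"$FLINK_HOME/bin/start-cluster.sh"
"$FLINK_HOME/bin/sql-client.sh" embedded
```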

On 2020-12-30 18:32:29, "hailongwang" <[email protected]> wrote:
>Did you add the jar to lib only after startup? Try restarting the SQL Client.
>
>On 2020-12-30 15:26:59, "jiangjiguang719" <[email protected]> wrote:
>>Running a Hive query via the SQL Client fails:
>>even though flink-connector-hive_2.11-1.12.0.jar is present, it still throws java.lang.ClassNotFoundException:
>>org.apache.flink.connectors.hive.HiveSource
>>Could you please take a look?
>>
>>
>>Error log:
>>
>>Flink SQL> select count(*) from zxw_test_1225_01;
>>2020-12-30 16:20:42,518 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.spark.client.submit.timeout.interval does not exist
>>2020-12-30 16:20:42,519 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.support.sql11.reserved.keywords does not exist
>>2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.spark.client.rpc.server.address.use.ip does not exist
>>2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.enforce.bucketing does not exist
>>2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.server2.enable.impersonation does not exist
>>2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.run.timeout.seconds does not exist
>>2020-12-30 16:20:43,065 WARN  org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory [] - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
>>2020-12-30 16:20:43,245 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 24
>>[ERROR] Could not execute SQL statement. Reason:
>>java.lang.ClassNotFoundException: org.apache.flink.connectors.hive.HiveSource
>>
>>
>>Contents of lib:
>># tree lib
>>lib
>>├── flink-connector-hive_2.11-1.12.0.jar
>>├── flink-csv-1.12.0.jar
>>├── flink-dist_2.11-1.12.0.jar
>>├── flink-hadoop-compatibility_2.11-1.12.0.jar
>>├── flink-json-1.12.0.jar
>>├── flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
>>├── flink-shaded-zookeeper-3.4.14.jar
>>├── flink-table_2.11-1.12.0.jar
>>├── flink-table-blink_2.11-1.12.0.jar
>>├── hive-exec-2.3.4.jar
>>├── log4j-1.2-api-2.12.1.jar
>>├── log4j-api-2.12.1.jar
>>├── log4j-core-2.12.1.jar
>>└── log4j-slf4j-impl-2.12.1.jar
