[ 
https://issues.apache.org/jira/browse/FLINK-20012?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Flink Jira Bot updated FLINK-20012:
-----------------------------------
      Labels: auto-deprioritized-critical auto-deprioritized-major 
auto-deprioritized-minor  (was: auto-deprioritized-critical 
auto-deprioritized-major stale-minor)
    Priority: Not a Priority  (was: Minor)

This issue was labeled "stale-minor" 7 days ago and has not received any 
updates so it is being deprioritized. If this ticket is actually Minor, please 
raise the priority and ask a committer to assign you the issue or revive the 
public discussion.


> Hive 3.1 integration exception
> ------------------------------
>
>                 Key: FLINK-20012
>                 URL: https://issues.apache.org/jira/browse/FLINK-20012
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive
>    Affects Versions: 1.11.2
>            Reporter: Dino Zhang
>            Priority: Not a Priority
>              Labels: auto-deprioritized-critical, auto-deprioritized-major, 
> auto-deprioritized-minor
>
> After adding the extra dependencies to the /lib directory, configuring the Hive 
> conf in sql-client-defaults.yaml, and running ./sql-client.sh embedded, I get 
> the following error:
> {code:java}
> Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
>   at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
>   at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
>   at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
>   at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
>   at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
>   at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
>   at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109)
>   at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:209)
>   at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:161)
>   at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:378)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:626)
>   at java.util.HashMap.forEach(HashMap.java:1289)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
>   at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
>   ... 3 more
> {code}
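> (For reference, an illustrative catalog section for conf/sql-client-defaults.yaml; 
> the catalog name and the hive-conf-dir path below are placeholders, not taken 
> from the ticket:)
> {code:yaml}
> catalogs:
>   - name: myhive
>     type: hive
>     hive-conf-dir: /opt/hive/conf
> {code}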
> At the same time, I found that flink-1.11.2 ships Guava 18, while Hive 3.1 
> depends on Guava 27.
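A triage note: the missing overload Preconditions.checkArgument(boolean, String, Object) was, if I read the Guava release history correctly, only added in Guava 20, so a Guava 18 jar winning the classpath race over Hive's Guava 27 would produce exactly this NoSuchMethodError. The following diagnostic is a hypothetical sketch (the class name GuavaLocator is mine, not from the ticket); it prints which jar actually provides Preconditions at runtime, without needing Guava at compile time:

```java
// Hypothetical diagnostic, not part of Flink or this ticket: prints which
// classpath entry provides a given class, e.g. Guava's Preconditions.
public class GuavaLocator {
    static String locate(String className) {
        // Look the class up as a resource so no compile-time dependency is needed
        String resource = className.replace('.', '/') + ".class";
        java.net.URL url = GuavaLocator.class.getClassLoader().getResource(resource);
        return url == null ? "not on classpath" : url.toString();
    }

    public static void main(String[] args) {
        // Prints the jar (or "not on classpath") that supplies Preconditions
        System.out.println(locate("com.google.common.base.Preconditions"));
    }
}
```

Running this with the same classpath that sql-client.sh assembles shows which Guava version the Hive catalog actually sees.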



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
