[
https://issues.apache.org/jira/browse/FLINK-21142?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17271953#comment-17271953
]
Rui Li commented on FLINK-21142:
--------------------------------
[~YUJIANBO] That depends on how the guava jars are used in your job. The
principle is to avoid different versions of the same guava class in the class
path. So you don't need to replace if it's not added to class path.
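One quick way to see which guava actually wins on your class path is a small
standalone probe (my own illustration, not Flink code — the class name
GuavaProbe is made up). Run it with the same class path as the Flink client:

```java
// GuavaProbe.java — reports which jar a class is loaded from and whether a
// given method overload exists. Compile and run it on the same class path
// the failing job uses.
public class GuavaProbe {

    // Returns the jar (or directory) a class was loaded from,
    // "bootstrap" for JDK classes, or a note if the class is missing.
    public static String probe(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src =
                    c.getProtectionDomain().getCodeSource();
            return src == null ? "bootstrap" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on class path";
        }
    }

    // Checks whether a public method with the given signature exists.
    public static boolean hasMethod(String className, String method,
                                    Class<?>... sig) {
        try {
            Class.forName(className).getMethod(method, sig);
            return true;
        } catch (ReflectiveOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String guava = "com.google.common.base.Preconditions";
        System.out.println(guava + " loaded from: " + probe(guava));
        // The descriptor (ZLjava/lang/String;Ljava/lang/Object;)V from the
        // reported error means checkArgument(boolean, String, Object) — an
        // overload that older guava versions (before roughly 20.0) lack.
        System.out.println("checkArgument(boolean, String, Object) present: "
                + hasMethod(guava, "checkArgument",
                        boolean.class, String.class, Object.class));
    }
}
```

If the reported jar is an old guava (or a fat jar bundling one) and the
overload is missing, that jar is the one shadowing the version Hadoop needs.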
> Flink guava Dependence problem
> ------------------------------
>
> Key: FLINK-21142
> URL: https://issues.apache.org/jira/browse/FLINK-21142
> Project: Flink
> Issue Type: Bug
> Components: Connectors / Hadoop Compatibility, Connectors / Hive
> Affects Versions: 1.12.0
> Reporter: YUJIANBO
> Priority: Major
>
> We set up a new Hadoop cluster, and we use the flink 1.12.0 that we compiled
> from the release-1.12.0 branch. If I add a hive jar to flink/lib/, it
> reports errors.
> *Operating environment:*
> flink1.12.0
> Hadoop 3.3.0
> hive 3.1.2
> *Run the official Flink demo: /tmp/yjb/buildjar/flink1.12.0/bin/flink run
> -m yarn-cluster /usr/local/flink1.12.0/examples/streaming/WordCount.jar*
> If I put one of the jars *flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar* or
> *hive-exec-3.1.2.jar* in the lib directory and execute the above shell, an
> error is reported: java.lang.NoSuchMethodError:
> com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V.
> *We can see that it's the dependency conflict of guava.*
> *The guava versions on my cluster:*
> /usr/local/hadoop-3.3.0/share/hadoop/yarn/csi/lib/guava-20.0.jar
> /usr/local/hadoop-3.3.0/share/hadoop/common/lib/guava-27.0-jre.jar
> /usr/local/apache-hive-3.1.2-bin/lib/guava-20.0.jar
> /usr/local/apache-hive-3.1.2-bin/lib/jersey-guava-2.25.1.jar
> /usr/local/spark-3.0.1-bin-hadoop3.2/jars/guava-14.0.1.jar
> *Can you give me some advice?*
> Thank you!
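For reference, when a guava conflict cannot be avoided by keeping jars off the
class path, the usual remedy is to relocate guava inside the job jar so it can
no longer clash with Hadoop's or Hive's copy (Flink itself ships relocated
guava as flink-shaded-guava for the same reason). A sketch with
maven-shade-plugin — the relocated package name below is an arbitrary choice,
not something Flink prescribes:

```xml
<!-- In the job's pom.xml: bundle and relocate guava so the job's copy
     cannot shadow the guava classes Hadoop/Hive expect. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>myorg.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```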
--
This message was sent by Atlassian Jira
(v8.3.4#803005)