Hello, could someone tell me what causes the problem below when submitting a job with Flink 1.10.0 on YARN? Is it a Hadoop jar dependency issue? The same program runs fine on every version below 1.10, but it cannot be submitted on 1.10.0.
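
For context, here is a minimal sketch of the kind of submit command used (the jar name, main class, and memory settings are placeholders, not the real values). My understanding, which may be wrong, is that the Flink 1.10 binary distribution no longer bundles Hadoop jars, so the Hadoop classpath has to be supplied by the environment before calling flink run:

    # jar name and main class below are placeholders, not the real job
    export HADOOP_CLASSPATH=$(hadoop classpath)
    ./bin/flink run -m yarn-cluster \
        -yjm 1024m -ytm 2048m \
        -c com.example.MyJob \
        /path/to/my-job.jar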

Thanks!
================================================================

[jacob@hadoop001 bin]$ ./yarn logs -applicationId application_1603495749855_57650
20/12/11 18:52:55 INFO client.RMProxy: Connecting to ResourceManager at localhost:8032
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/app/hadoop_client/e11_backend/hadoop-2.6.0-cdh5.8.3/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/app/hadoop-2.6.0-cdh5.8.3/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/12/11 18:52:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


Container: container_1603495749855_57650_02_000001 on localhost
=====================================================================================
LogType:jobmanager.err
Log Upload Time:Fri Dec 11 18:49:21 -0800 2020
LogLength:2368
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/hadoop/dn/sdc/yarn/nm/usercache/jacob/appcache/application_1603495749855_57650/filecache/11/datafeed-website-filter_flink-0.0.1-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/hadoop/dn/sde/yarn/nm/usercache/jacob/appcache/application_1603495749855_57650/filecache/17/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.8.3-1.cdh5.8.3.p0.2/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.addDeprecations([Lorg/apache/hadoop/conf/Configuration$DeprecationDelta;)V
        at org.apache.hadoop.mapreduce.util.ConfigUtil.addDeprecatedKeys(ConfigUtil.java:54)
        at org.apache.hadoop.mapreduce.util.ConfigUtil.loadResources(ConfigUtil.java:42)
        at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:119)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1659)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:91)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
        at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:182)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:235)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
        at org.apache.flink.yarn.entrypoint.YarnEntrypointUtils.logYarnEnvironmentInformation(YarnEntrypointUtils.java:136)
        at org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint.main(YarnJobClusterEntrypoint.java:109)



