Hi all,
        
The jar built from my Flink code is stored on HDFS, but when I submit it with the Flink command line, it seems the client can only resolve a local jar. Is it true that it cannot resolve a jar located on HDFS?

After I download the jar to the local filesystem of the server, the job runs successfully.
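For reference, this is roughly the workaround I use (a sketch only: the local target path /tmp/... is just an example, and here the downloaded jar is passed as the program jar argument instead of -yj):

# copy the jar from HDFS to the local server
hdfs dfs -get hdfs://ysec-storage/flink/runJar/business-security-1.0-SNAPSHOT.jar /tmp/business-security-1.0-SNAPSHOT.jar

# submit to the existing YARN session using the local jar path
./bin/flink run -yid application_1567652112073_0001 -p 6 /tmp/business-security-1.0-SNAPSHOT.jar --appId act_test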


The command I ran:
./bin/flink run -yid application_1567652112073_0001 -p 6 -yj hdfs://ysec-storage/flink/runJar/business-security-1.0-SNAPSHOT.jar --appId act_test



The output was:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/data/flink/flink-1.9.0/lib/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/usr/hdp/2.6.5.0-292/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2019-09-29 11:48:15,686 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli     
            - Found Yarn properties file under /tmp/.yarn-properties-hdfs.
2019-09-29 11:48:15,686 INFO  org.apache.flink.yarn.cli.FlinkYarnSessionCli     
            - Found Yarn properties file under /tmp/.yarn-properties-hdfs.
Could not build the program from JAR file.
