[
https://issues.apache.org/jira/browse/FLINK-11972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16797915#comment-16797915
]
Yu Li commented on FLINK-11972:
-------------------------------
Sure, let me open a JIRA for the leftover issue. Thanks. [~sunjincheng121]
> The classpath is missing `flink-shaded-hadoop2-uber-2.8.3-1.8.0.jar` during the
> end-to-end test.
> --------------------------------------------------------------------------------------------------------
>
> Key: FLINK-11972
> URL: https://issues.apache.org/jira/browse/FLINK-11972
> Project: Flink
> Issue Type: Bug
> Components: Tests
> Affects Versions: 1.8.0, 1.9.0
> Reporter: sunjincheng
> Assignee: Yu Li
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.8.0, 1.9.0
>
> Attachments: image-2019-03-21-06-26-49-787.png, screenshot-1.png
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> Unlike 1.7.x, the 1.8.x distribution no longer bundles the shaded Hadoop JAR in
> dist. As a result, end-to-end tests that rely on Hadoop classes fail with errors
> such as `java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem`. We
> therefore need to either improve the end-to-end test script or state explicitly
> in the README that the end-to-end tests require
> `flink-shaded-hadoop2-uber-XXXX.jar` on the classpath. Without it, we get an
> exception like the following:
> {code:java}
> [INFO] 3 instance(s) of taskexecutor are already running on jinchengsunjcs-iMac.local.
> Starting taskexecutor daemon on host jinchengsunjcs-iMac.local.
> java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;
> at java.lang.Class.getDeclaredFields0(Native Method)
> at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
> at java.lang.Class.getDeclaredFields(Class.java:1916)
> at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:72)
> at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:1558)
> at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:185)
> at org.apache.flink.streaming.api.datastream.DataStream.addSink(DataStream.java:1227)
> at org.apache.flink.streaming.tests.BucketingSinkTestProgram.main(BucketingSinkTestProgram.java:80)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
> at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
> at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:423)
> at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
> at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
> at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
> at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
> at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> ... 22 more
> Job () is running.{code}
> So, I think we should either improve the end-to-end test script or update the
> README accordingly. What do you think?
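> For reference, here is a minimal sketch of the workaround I have in mind. It is
> not the actual test-script change; `FLINK_DIR` and the JAR paths are placeholders
> for a local setup, and the JAR version has to match the release under test:
> {code:bash}
> # Make the shaded Hadoop classes visible to the end-to-end test by dropping the
> # uber JAR into the distribution's lib/ directory before starting the cluster.
> # Paths and versions below are placeholders for a local setup.
> FLINK_DIR=${FLINK_DIR:-/path/to/flink-1.8.0}
> SHADED_HADOOP_JAR=/path/to/flink-shaded-hadoop2-uber-2.8.3-1.8.0.jar
>
> cp "$SHADED_HADOOP_JAR" "$FLINK_DIR/lib/"
>
> # Alternatively, if a local Hadoop installation is available, exporting its
> # classpath has the same effect for Hadoop-dependent tests:
> # export HADOOP_CLASSPATH=$(hadoop classpath)
>
> "$FLINK_DIR/bin/start-cluster.sh"
> {code}
> Whichever way we go, the README should mention this requirement so that people
> running the tests against a Hadoop-free dist are not surprised by the
> `NoClassDefFoundError`.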
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)