[ https://issues.apache.org/jira/browse/FLINK-2268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16155039#comment-16155039 ]
ASF GitHub Bot commented on FLINK-2268:
---------------------------------------
Github user aljoscha commented on a diff in the pull request:
https://github.com/apache/flink/pull/4636#discussion_r137211750
--- Diff: flink-runtime/src/main/java/org/apache/flink/runtime/util/EnvironmentInformation.java ---
@@ -284,7 +284,16 @@ public static void logEnvironmentInfo(Logger log, String componentName, String[]
        log.info(" JVM: " + jvmVersion);
        log.info(" Maximum heap size: " + maxHeapMegabytes + " MiBytes");
        log.info(" JAVA_HOME: " + (javaHome == null ? "(not set)" : javaHome));
-       log.info(" Hadoop version: " + VersionInfo.getVersion());
+
+       try {
+           Class.forName(
+               "org.apache.hadoop.util.VersionInfo",
+               false,
+               EnvironmentInformation.class.getClassLoader());
+           log.info(" Hadoop version: " + VersionInfo.getVersion());
--- End diff --
Yes, that is intended because I didn't want to fiddle with the reflection
API. Ideally, I would like to do this:
```
try {
log.info(" Hadoop version: " + VersionInfo.getVersion());
} catch (ClassNotFoundException e) {
// ignore
}
```
but Java won't compile this: the compiler rejects a catch block for a checked exception that the try body cannot throw, and `VersionInfo.getVersion()` does not declare `ClassNotFoundException` (if the class is missing at runtime, the call would fail with `NoClassDefFoundError` instead). With the explicit `Class.forName()` call, which does declare `ClassNotFoundException`, the compiler accepts the catch block.
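The probe described above can be sketched as a small standalone check. Note that the class and method names below are illustrative, not taken from the Flink codebase:

```java
// Sketch of an optional-dependency probe: look up a class by name without
// initializing it, and report whether it is on the classpath.
public class OptionalDependencyCheck {

    /** Returns true if the named class can be located on the classpath. */
    public static boolean isPresent(String className) {
        try {
            // 'false' means: locate the class but do not run its static
            // initializers. This is the same trick as in the diff above.
            Class.forName(className, false,
                    OptionalDependencyCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            // The class is not available; callers can skip the dependent code.
            return false;
        }
    }

    public static void main(String[] args) {
        // java.util.List is always present; the second name is made up.
        System.out.println(isPresent("java.util.List"));
        System.out.println(isPresent("org.example.NoSuchClass"));
    }
}
```

Because `Class.forName(String, boolean, ClassLoader)` is declared to throw `ClassNotFoundException`, the catch block compiles, and the guarded log statement only runs when Hadoop is actually present.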
> Provide Flink binary release without Hadoop
> -------------------------------------------
>
> Key: FLINK-2268
> URL: https://issues.apache.org/jira/browse/FLINK-2268
> Project: Flink
> Issue Type: Improvement
> Components: Build System
> Reporter: Robert Metzger
> Assignee: Aljoscha Krettek
>
> Currently, all Flink releases ship with Hadoop 2.3.0 binaries.
> The big Hadoop distributions are usually not relying on vanilla Hadoop
> releases, but on custom patched versions.
> To provide the best user experience, we should offer a Flink binary that uses
> the Hadoop jars provided by the user (i.e., by their Hadoop distribution).
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)