[
https://issues.apache.org/jira/browse/SPARK-1350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14237914#comment-14237914
]
Thomas Graves commented on SPARK-1350:
--------------------------------------
Which version of Hadoop are you using? Spark is using
Environment.JAVA_HOME.$(), but Environment is a Hadoop class, and it is
supposed to handle Windows:
public String $() {
  if (Shell.WINDOWS) {
    return "%" + variable + "%";
  } else {
    return "$" + variable;
  }
}
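For reference, the platform-dependent expansion above can be sketched as a small self-contained snippet. This is a hypothetical stand-in for Hadoop's Environment.$() (the real class lives in org.apache.hadoop.yarn.api.ApplicationConstants), just to show why the emitted string is expanded on the node rather than on the client:

```java
// Hypothetical mimic of Hadoop YARN's Environment.$() behavior; not the
// real org.apache.hadoop.yarn.api.ApplicationConstants.Environment class.
public class EnvVarDemo {
    static String ref(String variable, boolean isWindows) {
        // Emit an *unresolved* shell reference, so the value is expanded
        // on the node at container-launch time, not on the client.
        return isWindows ? "%" + variable + "%" : "$" + variable;
    }

    public static void main(String[] args) {
        // Each cluster node substitutes its own JAVA_HOME when the
        // container command line is interpreted.
        System.out.println(ref("JAVA_HOME", false) + "/bin/java"); // $JAVA_HOME/bin/java
        System.out.println(ref("JAVA_HOME", true) + "/bin/java");  // %JAVA_HOME%/bin/java
    }
}
```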
> YARN ContainerLaunchContext should use cluster's JAVA_HOME
> ----------------------------------------------------------
>
> Key: SPARK-1350
> URL: https://issues.apache.org/jira/browse/SPARK-1350
> Project: Spark
> Issue Type: Bug
> Components: YARN
> Affects Versions: 0.9.0
> Reporter: Sandy Ryza
> Assignee: Sandy Ryza
> Fix For: 1.0.0
>
>
> {code}
> var javaCommand = "java"
> val javaHome = System.getenv("JAVA_HOME")
> if ((javaHome != null && !javaHome.isEmpty()) || env.isDefinedAt("JAVA_HOME")) {
>   javaCommand = Environment.JAVA_HOME.$() + "/bin/java"
> }
> {code}
> Currently, if JAVA_HOME is specified on the client, it will be used instead
> of the value given on the cluster. This makes it so that Java must be
> installed in the same place on the client as on the cluster.
> This is a possibly incompatible change that we should get in before 1.0.
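A minimal sketch of the direction the description points toward, with illustrative names only (not the actual Spark YARN code): drop the client-side JAVA_HOME check and always emit the unresolved reference, so each node resolves its own JAVA_HOME at container launch.

```java
// Illustrative sketch, not the actual Spark YARN code: always defer
// JAVA_HOME resolution to the cluster node that launches the container.
public class ContainerCommandSketch {
    // Before: whether the reference was emitted depended on the *client's*
    // JAVA_HOME, tying the client's Java layout to the cluster's.
    static String javaCommandBefore(String clientJavaHome) {
        String javaCommand = "java";
        if (clientJavaHome != null && !clientJavaHome.isEmpty()) {
            // Unix form shown; on Windows this would be %JAVA_HOME%.
            javaCommand = "$JAVA_HOME" + "/bin/java";
        }
        return javaCommand;
    }

    // After: emit the unresolved reference unconditionally; each node
    // substitutes its own JAVA_HOME when the command line is interpreted.
    static String javaCommandAfter() {
        return "$JAVA_HOME" + "/bin/java";
    }

    public static void main(String[] args) {
        System.out.println(javaCommandBefore(null));           // java
        System.out.println(javaCommandBefore("/usr/lib/jvm")); // $JAVA_HOME/bin/java
        System.out.println(javaCommandAfter());                // $JAVA_HOME/bin/java
    }
}
```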
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]