[
https://issues.apache.org/jira/browse/SPARK-1350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14237525#comment-14237525
]
Aniket Bhatnagar commented on SPARK-1350:
-----------------------------------------
[~sandyr] Using Environment.JAVA_HOME.$() causes issues when submitting Spark
applications from a Windows box to a YARN cluster running on Linux (with the
Spark master set to yarn-client). This is because Environment.JAVA_HOME.$()
resolves to %JAVA_HOME%, which is not a valid executable path on Linux. Is
this a Spark issue or a YARN issue?
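To illustrate the mismatch: the client bakes the env-var reference into the
container command line using its *own* OS syntax, so a Windows client emits
%JAVA_HOME%, which a Linux NodeManager treats as a literal string. A minimal
sketch (hypothetical helper, not the Spark/YARN API) of deferring to the
target OS's syntax instead:

```scala
// Hypothetical sketch: choose the env-var reference syntax based on the
// OS of the cluster node that will launch the container, not the client.
object LaunchCommand {
  // "$JAVA_HOME" is expanded by the Unix shell on the node;
  // "%JAVA_HOME%" only works on Windows.
  def javaHomeRef(targetIsWindows: Boolean): String =
    if (targetIsWindows) "%JAVA_HOME%" else "$JAVA_HOME"

  def javaCommand(targetIsWindows: Boolean): String = {
    val sep = if (targetIsWindows) "\\" else "/"
    javaHomeRef(targetIsWindows) + sep + "bin" + sep + "java"
  }
}

// A Windows client submitting to a Linux cluster still produces a
// Unix-style command line:
// LaunchCommand.javaCommand(targetIsWindows = false)  // "$JAVA_HOME/bin/java"
```

For what it's worth, later Hadoop versions appear to address exactly this with
a cross-platform expansion variant (Environment.$$(), which emits a token the
NodeManager expands on the target OS), rather than expanding on the client.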
> YARN ContainerLaunchContext should use cluster's JAVA_HOME
> ----------------------------------------------------------
>
> Key: SPARK-1350
> URL: https://issues.apache.org/jira/browse/SPARK-1350
> Project: Spark
> Issue Type: Bug
> Components: YARN
> Affects Versions: 0.9.0
> Reporter: Sandy Ryza
> Assignee: Sandy Ryza
> Fix For: 1.0.0
>
>
> {code}
> var javaCommand = "java"
> val javaHome = System.getenv("JAVA_HOME")
> if ((javaHome != null && !javaHome.isEmpty()) ||
>     env.isDefinedAt("JAVA_HOME")) {
>   javaCommand = Environment.JAVA_HOME.$() + "/bin/java"
> }
> {code}
> Currently, if JAVA_HOME is specified on the client, it will be used instead
> of the value given on the cluster. This means Java must be installed in the
> same location on the client as on the cluster.
> This is a possibly incompatible change that we should get in before 1.0.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]