[
https://issues.apache.org/jira/browse/SPARK-10713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428264#comment-15428264
]
Wolfgang Buchner commented on SPARK-10713:
------------------------------------------
I am currently testing Spark 2.0 with Mesos and it seems I am hitting the same
issue: the executor cannot start because it is missing the Hadoop libraries,
even though they should have been resolved via SPARK_DIST_CLASSPATH.
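For context, the setup follows the "Hadoop free" build instructions linked in the issue. A minimal sketch of what I mean (the Mesos master URL and application jar below are placeholders, and spark.executor.extraClassPath is only a possible workaround, not a confirmed fix):

```shell
# In conf/spark-env.sh, per the hadoop-provided docs: point Spark at the
# Hadoop installation's classpath. This works for the driver, but the
# variable does not appear to reach Mesos executors.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

# Possible workaround while the variable is not forwarded: pass the same
# classpath explicitly at submit time via spark.executor.extraClassPath.
# Placeholder master URL and jar name.
spark-submit \
  --master mesos://zk://zk-host:2181/mesos \
  --conf "spark.executor.extraClassPath=$(hadoop classpath)" \
  my-app.jar
```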
> SPARK_DIST_CLASSPATH ignored on Mesos executors
> -----------------------------------------------
>
> Key: SPARK-10713
> URL: https://issues.apache.org/jira/browse/SPARK-10713
> Project: Spark
> Issue Type: Bug
> Components: Deploy, Mesos
> Affects Versions: 1.5.0
> Reporter: Dara Adib
> Priority: Minor
>
> If I set the environment variable SPARK_DIST_CLASSPATH, the jars are included
> on the driver, but not on Mesos executors. Docs:
> https://spark.apache.org/docs/latest/hadoop-provided.html
> I see SPARK_DIST_CLASSPATH mentioned in these files:
> launcher/src/main/java/org/apache/spark/launcher/AbstractCommandBuilder.java
> project/SparkBuild.scala
> yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
> But not the Mesos executor (or should it be included by the launcher
> library?):
> spark/core/src/main/scala/org/apache/spark/executor/Executor.scala
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)