[ https://issues.apache.org/jira/browse/SPARK-2454?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14064036#comment-14064036 ]

Andrew Or commented on SPARK-2454:
----------------------------------

There may be multiple installations of Spark on an executor machine, in which
case a single global SPARK_HOME environment variable is not sufficient. What I
am suggesting is that we keep the option of allowing the driver to override
the executors' Spark home (spark.executor.home), and only override an
executor's SPARK_HOME if this property is explicitly set.
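As a rough illustration, the override could look like the following
spark-defaults.conf sketch. Note that spark.executor.home is only a name
proposed in this discussion, not a property shipped in any Spark release:

```properties
# Hypothetical sketch; spark.executor.home is proposed in this thread,
# not an existing Spark configuration property.
#
# If set, the driver overrides SPARK_HOME on every executor with this path:
spark.executor.home    /opt/spark-1.1.0

# If left unset, each executor would fall back to its own local
# SPARK_HOME environment variable, preserving today's behavior.
```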

> Separate driver spark home from executor spark home
> ---------------------------------------------------
>
>                 Key: SPARK-2454
>                 URL: https://issues.apache.org/jira/browse/SPARK-2454
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Andrew Or
>             Fix For: 1.1.0
>
>
> The driver may not always share the same directory structure as the 
> executors. It makes little sense to always re-use the driver's spark home on 
> the executors.
> https://github.com/apache/spark/pull/1244/ is an open effort to fix this. 
> However, this still requires us to set SPARK_HOME on all the executor nodes. 
> Really we should separate this out into something like
> spark.driver.home
> spark.executor.home
> rather than re-using SPARK_HOME everywhere.



--
This message was sent by Atlassian JIRA
(v6.2#6252)