[ 
https://issues.apache.org/jira/browse/SPARK-2454?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14063402#comment-14063402
 ] 

Nan Zhu commented on SPARK-2454:
--------------------------------

This will make sparkHome an explicitly application-specific parameter. I 
just thought it would confuse users, since sparkHome is actually a global 
setting for all applications/executors running on the same machine.


The good thing here is that it lets users run applications against 
different versions of Spark while sharing the same cluster..... (especially 
useful when doing Spark dev work)

> Separate driver spark home from executor spark home
> ---------------------------------------------------
>
>                 Key: SPARK-2454
>                 URL: https://issues.apache.org/jira/browse/SPARK-2454
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Andrew Or
>             Fix For: 1.1.0
>
>
> The driver may not always share the same directory structure as the 
> executors. It makes little sense to always re-use the driver's spark home on 
> the executors.
> https://github.com/apache/spark/pull/1244/ is an open effort to fix this. 
> However, this still requires us to set SPARK_HOME on all the executor nodes. 
> Really we should separate this out into something like
> spark.driver.home
> spark.executor.home
> rather than re-using SPARK_HOME everywhere.
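
If the proposed split landed, the two homes could be set per application at
submit time. A hypothetical sketch, using the property names proposed above
(these are the proposal, not a confirmed final API, and the paths are made up):

```shell
# Hypothetical: set the proposed per-side Spark homes at submit time.
# spark.driver.home / spark.executor.home are the names proposed in this
# issue, not a shipped configuration; paths are illustrative only.
spark-submit \
  --conf spark.driver.home=/opt/spark-dev-build \
  --conf spark.executor.home=/usr/local/spark \
  --class org.example.App app.jar
```

This would let the driver run out of a local dev build while executors use
the cluster-wide install, without exporting SPARK_HOME on every node.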



--
This message was sent by Atlassian JIRA
(v6.2#6252)