[ https://issues.apache.org/jira/browse/SPARK-693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Albert Chu updated SPARK-693:
-----------------------------

    Attachment: SPARK-693.patch

We needed this support in our environment.  Attached is my patch implementing 
this for Spark 1.0.0.  A Git pull request will be sent as well.

> Let deploy scripts set alternate conf, work directories
> -------------------------------------------------------
>
>                 Key: SPARK-693
>                 URL: https://issues.apache.org/jira/browse/SPARK-693
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 0.6.2
>            Reporter: David Chiang
>            Priority: Minor
>         Attachments: SPARK-693.patch
>
>
> Currently SPARK_CONF_DIR is overridden in spark-config.sh, and 
> start-slaves.sh doesn't let the user pass in a -d option to set the work 
> directory. Allowing this is a small change that makes it possible to run 
> multiple clusters at once.
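
For illustration, a minimal sketch of how two clusters might be launched once
the conf and work directories are configurable. The -d option is the one named
in the description above; the paths, the sbin/ script layout (as in Spark
1.0.0), and the assumption that each cluster's conf sets non-conflicting
master/worker ports are illustrative, not taken from the patch itself:

    # Cluster A: point the deploy scripts at its own conf and work directories
    export SPARK_CONF_DIR=/opt/clusters/a/conf
    ./sbin/start-master.sh
    ./sbin/start-slaves.sh -d /opt/clusters/a/work

    # Cluster B: separate conf (with different ports) and work directory
    export SPARK_CONF_DIR=/opt/clusters/b/conf
    ./sbin/start-master.sh
    ./sbin/start-slaves.sh -d /opt/clusters/b/work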



--
This message was sent by Atlassian JIRA
(v6.2#6252)
