[ https://issues.apache.org/jira/browse/SPARK-967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-967.
-----------------------------
    Resolution: Not A Problem

I think this is obsolete, or no longer a problem; these scripts now respect 
the local SPARK_HOME on each node.
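
For context, a minimal sketch of the pattern the current scripts follow, 
with variable expansion deferred to each remote shell. The loop and file 
names here are illustrative only, not the actual sbin scripts:

    # Illustrative sketch, not the real sbin/slaves.sh.
    # Single-quoting the remote command defers expansion to the remote
    # shell, so each slave resolves its own SPARK_HOME.
    for slave in $(cat conf/slaves); do
      ssh "$slave" 'cd "$SPARK_HOME" && ./sbin/start-slave.sh'
    done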

> start-slaves.sh uses local path from master on remote slave nodes
> -----------------------------------------------------------------
>
>                 Key: SPARK-967
>                 URL: https://issues.apache.org/jira/browse/SPARK-967
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 0.8.0, 0.8.1, 0.9.0
>            Reporter: Evgeniy Tsvigun
>            Priority: Trivial
>              Labels: script, starter
>
> If a slave node has a home path different from the master's, start-slaves.sh 
> fails to start a worker instance there; on the other nodes it behaves as 
> expected. In my case: 
>     $ ./bin/start-slaves.sh 
>     node05.dev.vega.ru: bash: line 0: cd: /usr/home/etsvigun/spark/bin/..: No such file or directory
>     node04.dev.vega.ru: org.apache.spark.deploy.worker.Worker running as process 4796. Stop it first.
>     node03.dev.vega.ru: org.apache.spark.deploy.worker.Worker running as process 61348. Stop it first.
> I don't mention /usr/home anywhere; the only environment variable I set is 
> $SPARK_HOME, defined relative to $HOME on every node. This makes me think 
> some script takes `pwd` on the master and tries to use it on the slaves 
> (see the sketch after this quoted report). 
> Spark version: fb6875dd5c9334802580155464cef9ac4d4cc1f0
> OS:  FreeBSD 8.4
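
The hypothesis above matches a common shell-quoting pitfall. A minimal 
sketch of the suspected failure mode; the loop and file names are 
illustrative, not the actual start-slaves.sh:

    # Illustrative of the suspected bug, not the real script: the
    # double-quoted command is expanded on the master before ssh runs,
    # so the master's absolute path (here /usr/home/etsvigun/spark) is
    # sent verbatim to every slave, where it may not exist.
    for slave in $(cat conf/slaves); do
      ssh "$slave" "cd $SPARK_HOME/bin/.. && ./bin/start-slave.sh"
    done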


