[
https://issues.apache.org/jira/browse/SPARK-904?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14212078#comment-14212078
]
Andrew Ash commented on SPARK-904:
----------------------------------
[~ayushmishra2005] I suspect you don't have Spark installed on the remote
machine -- the {{start-all.sh}} script won't install it for you on remote
machines; it just ssh'es into each host listed in {{conf/slaves}} and expects
Spark to already be there at the same path as on the master.
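For what it's worth, a rough sketch of getting the worker machine into shape -- the host and path below are taken from your log, so adjust them for your setup:

```shell
# Sketch only; assumes the path and worker host from the report.
SPARK_HOME=/home/abc/spark-scala-2.10   # must be identical on master and worker
WORKER=abc@192.168.0.54                 # the entry from conf/slaves

# start-all.sh ssh'es to each host in conf/slaves and runs
# $SPARK_HOME/bin/start-slave.sh there, so the directory must exist remotely:
rsync -az "$SPARK_HOME/" "$WORKER:$SPARK_HOME/"

# Optional: set up passwordless SSH so start-all.sh doesn't prompt:
ssh-copy-id "$WORKER"
```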
If you're still having trouble, please reach out to the Spark users list via
http://spark.apache.org/community.html, which is a better place for these kinds
of questions anyway. I'm closing this issue for now, but let me know here if you
aren't able to get a resolution on the mailing list.
Thanks, and good luck with Spark!
Andrew
> Not able to Start/Stop Spark Worker from Remote Machine
> -------------------------------------------------------
>
> Key: SPARK-904
> URL: https://issues.apache.org/jira/browse/SPARK-904
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 0.7.3
> Reporter: Ayush
>
> I have two machines, A and B. I am trying to run the Spark master on machine A
> and the Spark worker on machine B.
> I have set machine B's host name in conf/slaves in my Spark directory.
> When I execute start-all.sh to start the master and workers, I get the
> following message on the console:
> abc@abc-vostro:~/spark-scala-2.10$ sudo sh bin/start-all.sh
> sudo: /etc/sudoers.d is world writable
> starting spark.deploy.master.Master, logging to
> /home/abc/spark-scala-2.10/bin/../logs/spark-root-spark.deploy.master.Master-1-abc-vostro.out
> 13/09/11 14:54:29 WARN spark.Utils: Your hostname, abc-vostro resolves to a
> loopback address: 127.0.1.1; using 1XY.1XY.Y.Y instead (on interface wlan2)
> 13/09/11 14:54:29 WARN spark.Utils: Set SPARK_LOCAL_IP if you need to bind to
> another address
> Master IP: abc-vostro
> cd /home/abc/spark-scala-2.10/bin/.. ;
> /home/abc/spark-scala-2.10/bin/start-slave.sh 1 spark://abc-vostro:7077
> [email protected]'s password:
> [email protected]: bash: line 0: cd: /home/abc/spark-scala-2.10/bin/..: No such
> file or directory
> [email protected]: bash: /home/abc/spark-scala-2.10/bin/start-slave.sh: No such
> file or directory
> The master starts, but the worker fails to start.
> I have set [email protected] in conf/slaves in my Spark directory.
> Can anyone help me resolve this? I'm probably missing some configuration on
> my end.
> However, when I run the Spark master and worker on the same machine, it works
> fine.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)