[ 
https://issues.apache.org/jira/browse/SPARK-8941?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Josh Rosen reopened SPARK-8941:
-------------------------------

I'm re-opening this ticket because this behavior change is a compatibility 
break for end users / distributors and should at least be documented / clearly 
communicated to them.

I believe that start-slave.sh is intended to be a public API because it’s the 
only method for manually launching a worker that is documented in the Spark 
Standalone documentation: 
https://spark.apache.org/docs/latest/spark-standalone.html

The current situation is pretty bad because the 1.4.0 documentation lists a 
command which won’t work (it still recommends the use of the out-of-date 
worker# argument).  Between 1.3.0 and 1.4.0, we also removed the 
public-facing documentation of the ./bin/spark-class 
org.apache.spark.deploy.worker.Worker method, which would have worked for 
both versions: https://spark.apache.org/docs/1.3.0/spark-standalone.html

At a minimum, we need to correct the documentation and audit the other examples 
to make sure there aren't other issues like this.
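To make the break concrete, here is a sketch of the three invocations in question, based on the commands quoted in the report below; the exact 1.4 argument handling should be verified against the start-slave.sh script itself:

```shell
# Spark 1.3.x form: start-slave.sh took a worker instance number followed
# by the master URL(s); multiple masters could be given comma-separated.
sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"

# Spark 1.4.0: the instance-number argument was dropped, so the 1.3-style
# invocation above fails to launch the worker. The 1.4 form takes only
# the master list:
sbin/start-slave.sh "spark://localhost:7077,localhost:7078"

# Launching the Worker class directly (documented in the 1.3.0 docs but
# removed from the 1.4.0 docs) works on both versions:
./bin/spark-class org.apache.spark.deploy.worker.Worker \
  spark://localhost:7077,localhost:7078
```

These commands require a running Spark standalone master, so they are illustrative rather than runnable in isolation.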

> Standalone cluster worker does not accept multiple masters on launch
> --------------------------------------------------------------------
>
>                 Key: SPARK-8941
>                 URL: https://issues.apache.org/jira/browse/SPARK-8941
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.4.0
>            Reporter: Jesper Lundgren
>            Priority: Trivial
>
> Before 1.4 it was possible to launch a worker node using a comma-separated 
> list of master nodes. 
> ex:
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> starting org.apache.spark.deploy.worker.Worker, logging to 
> /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
>                              Default is conf/spark-defaults.conf.
>   15/07/09 12:33:06 INFO Utils: Shutdown hook called
> Spark 1.2 and 1.3.1 accept multiple masters in this format.
> update: in 1.4, start-slave.sh expects only the master list (no instance number argument)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
