[ https://issues.apache.org/jira/browse/SPARK-8941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623952#comment-14623952 ]
Josh Rosen commented on SPARK-8941:
-----------------------------------
SGTM; do you want to open a new JIRA to follow up on the documentation issues,
plus separate issues for the other problems you've identified? If you do this,
just link the issues here and I'll close this one out. Thanks!
> Standalone cluster worker does not accept multiple masters on launch
> --------------------------------------------------------------------
>
> Key: SPARK-8941
> URL: https://issues.apache.org/jira/browse/SPARK-8941
> Project: Spark
> Issue Type: Bug
> Components: Deploy, Documentation
> Affects Versions: 1.4.0, 1.4.1
> Reporter: Jesper Lundgren
> Priority: Critical
>
> Before 1.4 it was possible to launch a worker node using a comma-separated
> list of master nodes, e.g.:
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> starting org.apache.spark.deploy.worker.Worker, logging to
> /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
> Default is conf/spark-defaults.conf.
> 15/07/09 12:33:06 INFO Utils: Shutdown hook called
> Spark 1.2 and 1.3.1 accept multiple masters in this format.
> Update: in 1.4, start-slave.sh expects only the master list (no instance number).
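For reference, the two invocation forms discussed above can be sketched as below. This is a sketch based on the example in this report; the host/port values are the reporter's example values, not a recommendation, and paths assume you are in the Spark distribution root.

```shell
# Spark 1.4+: start-slave.sh takes the master list directly, with no
# leading worker-instance-number argument.
sbin/start-slave.sh "spark://localhost:7077,localhost:7078"

# Pre-1.4 form (Spark 1.2 / 1.3.1), which additionally took a worker
# instance number as the first argument:
# sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
```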
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)