[ 
https://issues.apache.org/jira/browse/SPARK-2008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14153770#comment-14153770
 ] 

Shivaram Venkataraman commented on SPARK-2008:
----------------------------------------------

This will be a very useful feature for spark-ec2 and is a good issue to work 
on. I think removing slaves should be relatively easy to implement, since 
systems like HDFS and Spark are designed to tolerate slaves being removed. 

For adding slaves, we'll need a new script that runs setup-slave.sh 
(https://github.com/mesos/spark-ec2/blob/v3/setup-slave.sh) on each new 
instance and then brings up the HDFS DataNodes, Spark workers, etc.
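As a rough illustration of what such an add-slaves script might do, here is a 
dry-run sketch that prints the per-host commands rather than executing them. 
The function name, the hostnames, and the daemon paths 
(/root/ephemeral-hdfs, /root/spark) are assumptions based on the standard 
spark-ec2 AMI layout at the time, not a committed design:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: print the commands an "add-slave" helper for
# spark-ec2 might run against one new instance. Dry run only.
# All paths and service commands below are assumptions, not verified
# against the spark-ec2 repo.
add_slave_commands() {
  local slave="$1" master="$2"
  # 1. Run the existing per-slave setup (mounts disks, copies config, ...)
  echo "ssh root@${slave} 'bash setup-slave.sh'"
  # 2. Start the HDFS DataNode on the new slave
  echo "ssh root@${slave} '/root/ephemeral-hdfs/bin/hadoop-daemon.sh start datanode'"
  # 3. Start the Spark worker, pointing it at the existing master
  echo "ssh root@${slave} '/root/spark/sbin/start-slave.sh spark://${master}:7077'"
  # 4. Register the slave on the master so future cluster-wide scripts see it
  echo "echo ${slave} >> /root/spark-ec2/slaves"
}

add_slave_commands ec2-203-0-113-1.compute-1.amazonaws.com \
                   ec2-master.example.com
```

Removing a slave would roughly invert steps 2-4 (stop the daemons, drop the 
host from the slaves file) before terminating the instance.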

> Enhance spark-ec2 to be able to add and remove slaves to an existing cluster
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-2008
>                 URL: https://issues.apache.org/jira/browse/SPARK-2008
>             Project: Spark
>          Issue Type: New Feature
>          Components: EC2
>    Affects Versions: 1.0.0
>            Reporter: Nicholas Chammas
>            Priority: Minor
>
> Per [the discussion 
> here|http://apache-spark-user-list.1001560.n3.nabble.com/Having-spark-ec2-join-new-slaves-to-existing-cluster-td3783.html]:
> {quote}
> I would like to be able to use spark-ec2 to launch new slaves and add them to 
> an existing, running cluster. Similarly, I would also like to remove slaves 
> from an existing cluster.
> Use cases include:
> * Oh snap, I sized my cluster incorrectly. Let me add/remove some slaves.
> * During scheduled batch processing, I want to add some new slaves, perhaps 
> on spot instances. When that processing is done, I want to kill them. (Cruel, 
> I know.)
> I gather this is not possible at the moment. spark-ec2 appears to be able to 
> launch new slaves for an existing cluster only if the master is stopped. I 
> also do not see any ability to remove slaves from a cluster.
> {quote}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
