Hi All,

I am working with Spark to add new slaves automatically when there is more data 
to be processed by the cluster. During this process a question has arisen: 
after adding/removing a slave node to/from the Spark cluster, do we need to 
restart the master and the other existing slaves in the cluster?

From my observations:

1.       If a new slave node's details are added to the configuration file 
(/root/spark/conf/slaves) on the master node, running the "start-slaves.sh" 
script will add the new slave to the cluster without affecting the existing 
slaves or the master.

2.       If a slave's details are removed from the configuration file, one 
needs to restart the master using the stop-master.sh and start-master.sh 
scripts for the change to take effect.
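For reference, the two observations above can be sketched as shell commands. This is only an illustrative sequence, not tested output: the install path /root/spark, the sbin/ script locations, and the hostname new-slave-01 are assumptions for the example.

```shell
# Observation 1: on the master node, register a new worker and start it
# without disturbing the running master or existing workers.
# (new-slave-01 is a placeholder hostname.)
echo "new-slave-01" >> /root/spark/conf/slaves
/root/spark/sbin/start-slaves.sh      # launches workers listed in conf/slaves over SSH

# Observation 2: after removing a worker's entry from conf/slaves,
# the behaviour observed above was that the master had to be restarted
# for the change to take effect.
sed -i '/new-slave-01/d' /root/spark/conf/slaves
/root/spark/sbin/stop-master.sh
/root/spark/sbin/start-master.sh
```

These commands assume a standalone-mode deployment with passwordless SSH from the master to each worker, as described in the standalone documentation.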

Is there any reload option available in Spark to load the changed configuration 
files without stopping the services? Stopping the master or the existing slaves 
may lead to a service outage.
You can find the options available to start/stop the services of spark at 
http://spark.apache.org/docs/latest/spark-standalone.html


Thanks & Regards,
Sirisha Devineni.

DISCLAIMER
==========
This e-mail may contain privileged and confidential information which is the 
property of Persistent Systems Ltd. It is intended only for the use of the 
individual or entity to which it is addressed. If you are not the intended 
recipient, you are not authorized to read, retain, copy, print, distribute or 
use this message. If you have received this communication in error, please 
notify the sender and delete all copies of this message. Persistent Systems 
Ltd. does not accept any liability for virus infected mails.
