You can start a worker process on the master node as well, so that all nodes in your cluster can participate in the computation.
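
For example (a minimal sketch; the spark-class script location and the master port may differ in your deployment), you can launch a worker on the master node and point it at the master's URL:

    # run this on the master node; assumes the default master port 7077
    # replace <master-host> with the master's hostname or IP
    ./spark-class org.apache.spark.deploy.worker.Worker spark://<master-host>:7077

The worker registers with the master and shows up in the master's web UI like any other worker.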

Best, 

-- 
Nan Zhu


On Wednesday, January 15, 2014 at 11:32 PM, Manoj Samel wrote:

> When Spark is deployed on a cluster in standalone deployment mode (v0.8.1),
> one of the nodes is started as the master and the others as workers.
> 
> What does the master node do? Can it participate in actual computations,
> or does it just act as a coordinator?
> 
> Thanks,
> 
> Manoj 
