When Spark is deployed on a cluster in standalone deployment mode (v0.8.1),
one of the nodes is started as the master and the others as workers.
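
For reference, a minimal sketch of how I understand the pieces fit
together; the master URL spark://master-host:7077 and the app name are
placeholders for illustration, not my actual setup:

    import org.apache.spark.SparkContext

    object StandaloneExample {
      def main(args: Array[String]) {
        // The driver registers with the standalone master, which
        // allocates executors on the worker nodes.
        val sc = new SparkContext("spark://master-host:7077", "StandaloneExample")
        // The reduce itself runs as tasks on the workers.
        val sum = sc.parallelize(1 to 100).reduce(_ + _)
        println("sum = " + sum)
        sc.stop()
      }
    }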

What does the master node do? Can it participate in actual computations,
or does it just act as a coordinator?

Thanks,

Manoj
