For a Hadoop cluster with N data nodes, I am deploying a Spark cluster with
N worker servers (each colocated with a data node).

I'm not sure whether it is OK to start the master on one of the worker
servers, or whether the master is demanding enough that it should run on a
node separate from any of the worker nodes.
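For concreteness, this is roughly the colocated layout I have in mind, as a
sketch only. The hostname and the memory/core figures are placeholders, and
on the shared node the worker's allocation would be reduced to leave room for
the master daemon and the HDFS DataNode:

```shell
# conf/spark-env.sh on the node that would run both the master and a worker
# (node01 and all sizes below are placeholder values, not a recommendation)
export SPARK_MASTER_HOST=node01    # this host is also a worker / data node
export SPARK_DAEMON_MEMORY=1g      # heap for the master daemon itself
export SPARK_WORKER_MEMORY=30g     # reduced to leave headroom for master + DataNode
export SPARK_WORKER_CORES=14       # likewise, a couple of cores held back

# Then, on node01:
#   sbin/start-master.sh
#   sbin/start-worker.sh spark://node01:7077   # start-slave.sh on older Spark
```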

Thanks,
