Hi,

You have to specify the worker nodes of the Spark cluster when you configure the cluster.
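
For example (a minimal sketch, assuming a standalone cluster and the
illustrative hostnames worker1 and worker2), the worker nodes are listed
in conf/slaves on the machine running the master, and the cluster is
then brought up with the standalone launch scripts:

    # conf/slaves -- one worker hostname per line
    worker1
    worker2

    # start the master and all workers listed in conf/slaves
    ./sbin/start-all.sh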

Thanks
Madhvi
On Thursday 30 April 2015 01:30 PM, xiaohe lan wrote:
Hi Madhvi,

If I only install Spark on one node and use spark-submit to run an application, which are the worker nodes? And where are the executors?

Thanks,
Xiaohe

On Thu, Apr 30, 2015 at 12:52 PM, madhvi <madhvi.gu...@orkash.com> wrote:

    Hi,
    Follow the installation instructions at the following link:
    http://mbonaci.github.io/mbo-spark/
    You don't need to install Spark on every node. Just install it on
    one node, or install it on a remote system, and build a Spark
    cluster from there.
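
    Once Spark is installed on that node, you can submit an application
    to the YARN cluster with spark-submit. A rough sketch (the class
    name, jar path, and resource sizes are only illustrative, and
    HADOOP_CONF_DIR must point at your Hadoop configuration directory):

        # tell Spark where the Hadoop/YARN configuration lives
        export HADOOP_CONF_DIR=/etc/hadoop/conf

        # run the bundled SparkPi example in yarn-cluster mode
        ./bin/spark-submit \
          --master yarn-cluster \
          --num-executors 4 \
          --executor-memory 2g \
          --class org.apache.spark.examples.SparkPi \
          lib/spark-examples*.jar 100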
    Thanks
    Madhvi

    On Thursday 30 April 2015 09:31 AM, xiaohe lan wrote:

        Hi experts,

        I see Spark on YARN has yarn-client and yarn-cluster modes. I
        have a 5-node Hadoop cluster (Hadoop 2.4). How do I install
        Spark if I want to try Spark on YARN mode?

        Do I need to install Spark on each node of the Hadoop cluster?

        Thanks,
        Xiaohe





