You don't need to install Spark on every node. Just download or build a
Spark package that matches your YARN version, and make sure HADOOP_CONF_DIR
or YARN_CONF_DIR points to the directory that contains the (client-side)
configuration files for the Hadoop cluster.
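As a minimal sketch of what that looks like on the submitting node: the
Spark version and the `/etc/hadoop/conf` path below are examples, not
requirements; substitute the package matching your Hadoop 2.4 cluster and
your cluster's actual client configuration directory.

```shell
# Download a pre-built Spark package matching the cluster's Hadoop/YARN
# version (1.3.1 / Hadoop 2.4 is an example current as of this thread).
wget http://archive.apache.org/dist/spark/spark-1.3.1/spark-1.3.1-bin-hadoop2.4.tgz
tar xzf spark-1.3.1-bin-hadoop2.4.tgz
cd spark-1.3.1-bin-hadoop2.4

# Point Spark at the Hadoop client-side configuration files
# (hypothetical path -- use wherever your cluster's conf actually lives).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit the bundled SparkPi example in yarn-cluster mode. YARN starts the
# executors in containers on the cluster nodes, which is why Spark only
# needs to be installed on this one machine.
./bin/spark-submit \
  --master yarn-cluster \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-1.3.1-hadoop2.4.0.jar 10
```

In yarn-client mode the driver would instead run in this local shell, but
the executors are still YARN containers either way.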

See instructions here:
http://spark.apache.org/docs/latest/running-on-yarn.html


Best Regards,
Shixiong Zhu

2015-04-30 1:00 GMT-07:00 xiaohe lan <zombiexco...@gmail.com>:

> Hi Madhvi,
>
> If I only install Spark on one node and use spark-submit to run an
> application, which nodes are the workers? And where do the executors run?
>
> Thanks,
> Xiaohe
>
> On Thu, Apr 30, 2015 at 12:52 PM, madhvi <madhvi.gu...@orkash.com> wrote:
>
>> Hi,
>> Follow the installation instructions at the following link:
>> http://mbonaci.github.io/mbo-spark/
>> You don't need to install Spark on every node. Just install it on one
>> node; you can also install it on a remote system and form a Spark cluster.
>> Thanks
>> Madhvi
>>
>> On Thursday 30 April 2015 09:31 AM, xiaohe lan wrote:
>>
>>> Hi experts,
>>>
>>> I see Spark on YARN has yarn-client and yarn-cluster modes. I also have a
>>> 5-node Hadoop cluster (Hadoop 2.4). How do I install Spark if I want to try
>>> the Spark on YARN mode?
>>>
>>> Do I need to install Spark on each node of the Hadoop cluster?
>>>
>>> Thanks,
>>> Xiaohe
>>>
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
