AFAIK you can pass the --hadoop-major-version parameter to the spark-ec2
<https://github.com/apache/spark/blob/master/ec2/spark_ec2.py> script to
switch the Hadoop version.
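For example, something like the following (a sketch only; the key pair,
identity file, region, and cluster name are placeholders, and the exact
Hadoop build installed for "2" depends on the spark-ec2 scripts shipped
with your Spark release):

```shell
# Launch an EC2 cluster with Hadoop major version 2 instead of the
# default 1 (which installs Hadoop 1.0.4).
# --key-pair / --identity-file / --region values are placeholders.
./ec2/spark-ec2 \
  --key-pair=my-keypair \
  --identity-file=/path/to/my-keypair.pem \
  --region=us-east-1 \
  --hadoop-major-version=2 \
  launch my-spark-cluster
```

Run it from your Spark 1.0.1 directory; `./ec2/spark-ec2 --help` lists
the accepted values for --hadoop-major-version in that release.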

Thanks
Best Regards


On Wed, Jul 23, 2014 at 6:07 AM, durga <durgak...@gmail.com> wrote:

> Hi,
>
> I am trying to create spark cluster using spark-ec2 file under spark1.0.1
> directory.
>
> 1) I noticed that it is always creating Hadoop version 1.0.4. Is there a
> way I can override that? I would like to have Hadoop 2.0.2.
>
> 2) I also want to install Oozie alongside it. Are there any scripts
> available along with spark-ec2 that can create Oozie instances for me?
>
> Thanks,
> D.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-could-I-start-new-spark-cluster-with-hadoop2-0-2-tp10450.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
