Sri

Have a look at the instructions here. They are written for Spark 1.5.1, but
should also work for 1.6:

https://www.linkedin.com/pulse/running-spark-151-cdh-deenar-toraskar-cfa?trk=hp-feed-article-title-publish&trkSplashRedir=true&forceNoSplash=true
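
The gist: you leave the Spark that CDH installed alone, unpack a newer
Spark on an edge node, and point it at the existing cluster
configuration. A minimal sketch (the tarball URL is the real Apache
archive; the Hadoop flavour and the config path are assumptions for a
typical CDH setup):

    # grab a prebuilt Spark 1.6 matching your Hadoop version
    wget http://archive.apache.org/dist/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz
    tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
    export SPARK_HOME=$PWD/spark-1.6.0-bin-hadoop2.6

    # point it at the cluster's Hadoop/YARN config (CDH default
    # location, an assumption)
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    export YARN_CONF_DIR=/etc/hadoop/conf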

Deenar


On 27 January 2016 at 20:16, Koert Kuipers <ko...@tresata.com> wrote:

> You need to build Spark 1.6 for your Hadoop distro, put it on the proxy
> node, and configure it to find your cluster (HDFS and YARN). Then use the
> spark-submit script from that Spark 1.6 build to launch your application
> on YARN.
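>
> For concreteness, a sketch of those two steps (the Maven profiles match
> the Spark 1.6 build docs, but the app class and jar names below are
> placeholders):
>
>     # in the Spark 1.6 source tree: build a runnable distribution
>     # for your Hadoop version
>     ./make-distribution.sh --name my-hadoop --tgz -Pyarn -Phadoop-2.6
>
>     # on the proxy node, with that build unpacked and the cluster's
>     # Hadoop config visible:
>     export HADOOP_CONF_DIR=/etc/hadoop/conf
>     bin/spark-submit --master yarn --deploy-mode cluster \
>       --class com.example.MyApp /path/to/my-app-assembly.jar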
>
> On Wed, Jan 27, 2016 at 3:11 PM, sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi Koert,
>>
>> I am submitting my code (a Spark jar) using spark-submit on the proxy
>> node. I checked the Spark version on the cluster and the node, and it
>> says 1.2. I didn't really understand what you mean.
>>
>> Can I ask YARN to use a different version of Spark? Or should I
>> override the SPARK_HOME variable to point at the Spark 1.6 jars?
>>
>> Thanks
>> Sri
>>
>> On Wed, Jan 27, 2016 at 7:45 PM, Koert Kuipers <ko...@tresata.com> wrote:
>>
>>> If you have YARN, you can just launch your Spark 1.6 job from a single
>>> machine that has Spark 1.6 available on it and ignore the version of
>>> Spark (1.2) that is installed on the cluster.
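>>>
>>> For example, something like this from that one machine (the install
>>> path and class name below are placeholders):
>>>
>>>     export HADOOP_CONF_DIR=/etc/hadoop/conf
>>>     /opt/spark-1.6.0/bin/spark-submit --master yarn \
>>>       --class com.example.MyApp /path/to/my-app.jar
>>>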
>>> On Jan 27, 2016 11:29, "kali.tumm...@gmail.com" <kali.tumm...@gmail.com>
>>> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I just realized the Cloudera version of Spark on my cluster is 1.2,
>>>> while the jar I built with Maven targets version 1.6, which is
>>>> causing the issue.
>>>>
>>>> Is there a way to run a Spark 1.6 application on a cluster that has
>>>> only Spark 1.2 installed?
>>>>
>>>> Thanks
>>>> Sri
>>>>
>>>>
>>
>>
>> --
>> Thanks & Regards
>> Sri Tummala
>>
>>
>
