If you have YARN, you can launch your Spark 1.6 job from a single
machine that has Spark 1.6 available on it, and ignore the version of
Spark (1.2) that is installed on the cluster.
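A minimal sketch of what that looks like, assuming you have unpacked a Spark 1.6 distribution yourself and the cluster's Hadoop config lives in /etc/hadoop/conf (paths, class name, and jar name here are all hypothetical):

```shell
# Point the Spark 1.6 client at the cluster's YARN/HDFS configuration.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit using YOUR Spark 1.6 spark-submit, not the cluster's 1.2 install.
# In yarn-cluster mode, YARN ships the 1.6 Spark assembly with the
# application, so the cluster-wide 1.2 install is never used by the job.
/opt/spark-1.6.0/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyJob \
  my-job-assembly.jar
```

This works because a YARN application bundles and localizes its own jars; only standalone or Mesos deployments tie you to the Spark version installed on the cluster nodes.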
On Jan 27, 2016 11:29, "kali.tumm...@gmail.com" <kali.tumm...@gmail.com>
wrote:

> Hi All,
>
> Just realized the Cloudera version of Spark on my cluster is 1.2; the jar
> which I built using Maven is version 1.6, which is causing issues.
>
> Is there a way to run a Spark 1.6 job on a cluster with Spark 1.2 installed?
>
> Thanks
> Sri
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-run-latest-version-of-spark-in-old-version-of-spark-in-cloudera-cluster-tp26087.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
