Hi Sri,

Each node in the cluster where Spark runs has Spark 1.2 installed. If you can, upgrade the cluster to Spark 1.6; otherwise you can't run a 1.6 build on those nodes.
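If upgrading isn't an option, the usual workaround is to build your jar against the cluster's Spark instead. As a rough sketch (assuming the Scala 2.10 artifacts; the exact CDH artifact version is whatever your distribution ships), pin the Spark dependency in your pom.xml to the cluster's version and mark it provided so spark-submit supplies the runtime classes:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <!-- match the cluster's Spark; CDH uses its own builds (e.g. a 1.2.0-cdh* version from Cloudera's repo) -->
      <version>1.2.0</version>
      <!-- provided: don't bundle Spark classes into your jar; the cluster supplies them at runtime -->
      <scope>provided</scope>
    </dependency>

This only helps if your code sticks to APIs that already exist in 1.2; anything that relies on 1.6-only features will still fail on those nodes.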
-honain

kali.tumm...@gmail.com wrote:
> Hi All,
>
> Just realized the Cloudera version of Spark on my cluster is 1.2; the jar
> which I built using Maven is version 1.6, which is causing the issue.
>
> Is there a way to run Spark version 1.6 on the 1.2 version of Spark?
>
> Thanks
> Sri