In this page http://spark.apache.org/docs/0.9.0/running-on-yarn.html

We have to use the Spark assembly jar to submit Spark apps to a YARN cluster.
I checked the assembly jar, and it contains some YARN classes that are
bundled in at compile time. Those YARN classes are not what I want.
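To see for yourself which YARN classes an assembly jar bundles, you can list its entries; a jar is just a zip archive. The sketch below is only an illustration: it builds a tiny fake "assembly" jar with one YARN-like class entry (the real path would be your spark-assembly-*.jar) and filters for YARN classes.

```python
import zipfile

# Hypothetical demo jar; in practice, point this at your spark-assembly-*.jar.
demo_jar = "demo-assembly.jar"

# Build a tiny fake assembly jar for demonstration purposes only.
with zipfile.ZipFile(demo_jar, "w") as zf:
    zf.writestr("org/apache/hadoop/yarn/api/ApplicationConstants.class", b"")
    zf.writestr("org/apache/spark/SparkContext.class", b"")

# A jar is a zip archive, so zipfile can enumerate its class entries.
with zipfile.ZipFile(demo_jar) as zf:
    yarn_classes = [name for name in zf.namelist() if "/yarn/" in name]

print(yarn_classes)
```

Running the same filter against a real spark-assembly jar shows exactly which YARN classes would sit on the classpath alongside the cluster's own.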

My question is: is it possible to use other jars to submit a Spark app to a
YARN cluster?
I do not want to use the assembly jar because its bundled YARN classes may
shadow the YARN classes on HADOOP_CLASSPATH. Also, if the YARN cluster is
upgraded, Spark has to be recompiled against the new version of YARN even
when the YARN APIs are unchanged.

Any help is appreciated! Thanks.

-- 
Regards
Gordon Wang
