Thanks for the reply! If I have an application master that starts some
Spark applications by forking processes (in yarn-client mode), then
essentially I have no easy way to pass the jar path to those forked Spark
applications? (Except by downloading the jar from the remote path to a
local temp dir, after resolving some permission issues, etc.?)
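
For concreteness, the workaround I have in mind is roughly the sketch
below, using Hadoop's FileSystem API. The HDFS path and temp-dir name are
made up for illustration, and this glosses over the permission issues
mentioned above:

    import java.io.File
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path

    // Copy the jar from HDFS to a local temp dir so the forked
    // yarn-client driver can be handed a local --jars path.
    val hadoopConf = new Configuration()
    val src = new Path("hdfs:///apps/myapp/libs/dep.jar") // hypothetical path
    val fs = src.getFileSystem(hadoopConf)
    val localDir = new File(System.getProperty("java.io.tmpdir"), "forked-spark-jars")
    localDir.mkdirs()
    val dst = new Path(localDir.getAbsolutePath, src.getName)
    fs.copyToLocalFile(false, src, dst) // false = don't delete the source
    // dst.toString can now be passed via --jars to the forked spark-submit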

On Tue, May 2, 2017 at 9:00 AM, Marcelo Vanzin <van...@cloudera.com> wrote:

> Remote jars are added to executors' classpaths, but not the driver's.
> In YARN cluster mode, they would also be added to the driver's class
> path.
>
> On Tue, May 2, 2017 at 8:43 AM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> > Hi, all
> >
> > For some reason, I tried to pass in an HDFS path to the --jars option in
> > spark-submit.
> >
> > According to the documentation,
> > http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
> > --jars should accept remote paths.
> >
> > However, in the implementation,
> > https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
> > it does not look like that is the case...
> >
> > Did I miss anything?
> >
> > Best,
> >
> > Nan
>
>
>
> --
> Marcelo
>
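
If I understand the distinction above correctly, it amounts to something
like the following (the jar path is hypothetical; --master, --deploy-mode,
and --jars are standard spark-submit flags):

    # yarn-client: the driver runs locally; the hdfs:// jar reaches the
    # executors' classpaths but not the driver's
    spark-submit --master yarn --deploy-mode client \
      --jars hdfs:///apps/libs/dep.jar ...

    # yarn-cluster: the driver runs in a YARN container; the hdfs:// jar
    # is added to the driver's classpath as well
    spark-submit --master yarn --deploy-mode cluster \
      --jars hdfs:///apps/libs/dep.jar ...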
