Re: --jars does not take remote jar?
I see. Thanks!

On Tue, May 2, 2017 at 9:12 AM, Marcelo Vanzin wrote:
> On Tue, May 2, 2017 at 9:07 AM, Nan Zhu wrote:
> > I have no easy way to pass jar path to those forked Spark
> > applications? (except that I download jar from a remote path to a local temp
> > dir after resolving some permission issues, etc.?)
>
> Yes, that's the only way currently in client mode.
>
> --
> Marcelo
Re: --jars does not take remote jar?
On Tue, May 2, 2017 at 9:07 AM, Nan Zhu wrote:
> I have no easy way to pass jar path to those forked Spark
> applications? (except that I download jar from a remote path to a local temp
> dir after resolving some permission issues, etc.?)

Yes, that's the only way currently in client mode.

--
Marcelo
Re: --jars does not take remote jar?
Thanks for the reply!

If I have an application master which starts some Spark applications by forking processes (in yarn-client mode), essentially I have no easy way to pass jar path to those forked Spark applications? (except that I download jar from a remote path to a local temp dir after resolving some permission issues, etc.?)

On Tue, May 2, 2017 at 9:00 AM, Marcelo Vanzin wrote:
> Remote jars are added to executors' classpaths, but not the driver's.
> In YARN cluster mode, they would also be added to the driver's class
> path.
>
> On Tue, May 2, 2017 at 8:43 AM, Nan Zhu wrote:
> > Hi, all
> >
> > For some reason, I tried to pass in a HDFS path to the --jars option in
> > spark-submit
> >
> > According to the document,
> > http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
> > --jars would accept remote path
> >
> > However, in the implementation,
> > https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
> > it does not look like so
> >
> > Did I miss anything?
> >
> > Best,
> >
> > Nan
>
> --
> Marcelo
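[Editor's note: the workaround mentioned above, downloading the jar from the remote path to a local temp dir before forking spark-submit in client mode, could be sketched roughly as follows. All paths, jar names, and the class name are hypothetical, and this assumes the `hdfs` CLI is available to the forking process.]

```shell
# Sketch of the client-mode workaround: copy the remote jar to a local
# temp directory first, then hand the local path to --jars.
# Paths and names below are hypothetical placeholders.
TMP_DIR=$(mktemp -d)
hdfs dfs -get hdfs:///libs/extra-dep.jar "$TMP_DIR/extra-dep.jar"

spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.Main \
  --jars "file://$TMP_DIR/extra-dep.jar" \
  app.jar
```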
Re: --jars does not take remote jar?
Remote jars are added to executors' classpaths, but not the driver's.
In YARN cluster mode, they would also be added to the driver's class
path.

On Tue, May 2, 2017 at 8:43 AM, Nan Zhu wrote:
> Hi, all
>
> For some reason, I tried to pass in a HDFS path to the --jars option in
> spark-submit
>
> According to the document,
> http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
> --jars would accept remote path
>
> However, in the implementation,
> https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
> it does not look like so
>
> Did I miss anything?
>
> Best,
>
> Nan

--
Marcelo
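[Editor's note: per the explanation above, the cluster-mode alternative, where the driver runs inside YARN and so remote --jars reach the driver's classpath too, might look like this. Paths and the class name are hypothetical.]

```shell
# In yarn-cluster mode the driver runs in the YARN application master,
# so jars listed in --jars (including remote hdfs:// paths) are localized
# for the driver as well as the executors. Hypothetical paths/names.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.Main \
  --jars hdfs:///libs/extra-dep.jar \
  app.jar
```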
--jars does not take remote jar?
Hi, all

For some reason, I tried to pass an HDFS path to the --jars option in spark-submit.

According to the document,
http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
--jars would accept a remote path.

However, in the implementation,
https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
it does not look like so.

Did I miss anything?

Best,

Nan
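[Editor's note: the attempt being described presumably resembled the following, a remote (HDFS) jar passed to --jars in client mode; jar path and class name are made up for illustration. As the replies upthread explain, in client mode the remote jar reaches the executors' classpaths but not the driver's.]

```shell
# Hypothetical invocation illustrating the question: a remote hdfs:// jar
# given to --jars while submitting in yarn-client mode.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.Main \
  --jars hdfs:///libs/extra-dep.jar \
  app.jar
```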