I see. Thanks!

On Tue, May 2, 2017 at 9:12 AM, Marcelo Vanzin wrote:
> Yes, that's the only way.
>
> On Tue, May 2, 2017 at 9:07 AM, Nan Zhu wrote:
> > Thanks for the reply! If I have an application master which starts some
> > Spark applications by forking processes (in yarn-client mode),
> > essentially I have no easy way to pass the jar path to those forked Spark
> > applications? (except that I download the jar from a remote path to a local
> > temp dir after resolving some permission issues, etc.?)
> >
> > Marcelo Vanzin wrote:
> > > Remote jars are added to executors' classpaths, but not the driver's.
> > > In YARN cluster mode, they would also be added to the driver's class
> > > path.
> > >
> > > On Tue, May 2, 2017 at 8:43 AM, Nan Zhu wrote:
> > > > Hi, all
> > > >
> > > > For some reason, I tried to pass an HDFS path to the --jars option
> > > > in spark-submit.
> > > >
> > > > According to the documentation,
> > > > http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
> > > > --jars should accept remote paths.
> > > >
> > > > However, in the implementation,
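
For reference, a minimal sketch of the kind of submission being discussed. The application name, class, and jar paths here are hypothetical placeholders, not taken from the thread:

```shell
# Hypothetical yarn-client submission passing a remote (HDFS) jar via --jars.
# Per the discussion above: in client mode the remote jar is added to the
# executors' classpaths but NOT the driver's; in cluster mode it would
# reach the driver's classpath as well.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  --jars hdfs:///libs/dependency.jar \
  /local/path/my-app.jar
```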
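
The workaround the thread settles on (downloading the jar from HDFS to a local temp dir before forking spark-submit) might look roughly like the following. All paths and the class name are illustrative assumptions, and this assumes the forking process has HDFS read access:

```shell
# Sketch of the workaround: fetch the dependency jar from HDFS into a
# local scratch dir so the forked driver (yarn-client mode) can load it.
TMPDIR=$(mktemp -d)
hdfs dfs -get hdfs:///libs/dependency.jar "$TMPDIR/"  # may require sorting out permissions first
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  --jars "$TMPDIR/dependency.jar" \
  /local/path/my-app.jar
```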