From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Thursday, June 11, 2015 2:32 PM
To: Dong Lei
Cc: Dianfei (Keith) Han; dev@spark.apache.org<mailto:dev@spark.apache.org>
Subject: Re: How to support dependency jars and files on HDFS in standalone
cluster mode?
Oh sorry, I mistook --jars for --files. Yeah, for jars we need to add them to the classpath, which is different from regular files.
Cheng
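
The distinction described above can be sketched from inside a running application. This is only a sketch: `sc` stands for an existing SparkContext, and the hdfs:// paths are hypothetical. `SparkContext.addJar` distributes a jar to executors and adds it to their classpath, while `addFile` only downloads a regular file into each executor's working directory.

```scala
// Sketch only: assumes an existing SparkContext `sc`; paths are hypothetical.

// Jars must end up on the classpath, so Spark treats them specially:
sc.addJar("hdfs://ip/1.jar")       // fetched by executors and added to their classpath

// Regular files are just downloaded to each executor's working directory:
sc.addFile("hdfs://ip/config.txt") // later retrievable via SparkFiles.get("config.txt")
```
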
On 6/11/15 2:18 PM, Dong Lei wrote:
Thanks Cheng,
If I do not use --jars, how can ...
From: Cheng Lian [mailto:lian.cs@gmail.com]
Sent: Thursday, June 11, 2015 12:50 PM
To: Dong Lei; dev@spark.apache.org
Cc: Dianfei (Keith) Han
Subject: Re: How to support dependency jars and files on HDFS in standalone
cluster mode?
Since the jars are already on HDFS, you can access them directly in your
Spark application without using --jars
Cheng
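
A minimal sketch of the direct-access pattern suggested here, assuming an existing SparkContext `sc` and a hypothetical HDFS path: since the data is already on HDFS and reachable from every node, the application can open it itself rather than shipping it with --files.

```scala
// Sketch only: `sc` is an existing SparkContext; the path is hypothetical.
// Files already on HDFS can be read directly, without --files:
val lines = sc.textFile("hdfs://ip/data.txt")
println(lines.count())
```
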
On 6/11/15 11:04 AM, Dong Lei wrote:
Hi spark-dev:

I can not use an HDFS location for the "--jars" or "--files" option when doing a spark-submit in standalone cluster mode. For example:

spark-submit ... --jars hdfs://ip/1.jar hdfs://ip/app.jar (standalone cluster mode)

will not download 1.jar to the driver's ...