> Everything after the jar path is passed to the main class as parameters.

I don't think that is accurate if your application arguments contain double 
dashes. I've tried several permutations with and without '\'s and newlines.

Just thought I'd ask here before I have to re-configure and re-compile all my 
jars. Here is the script (test3.sh) and the output from one run:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar
  --num-decimals=1000
  --second-argument=Arg2

{
  "action" : "CreateSubmissionResponse",
  "serverSparkVersion" : "2.1.0",
  "submissionId" : "driver-20170228155848-0016",
  "success" : true
}
./test3.sh: line 15: --num-decimals=1000: command not found
./test3.sh: line 16: --second-argument=Arg2: command not found
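
For context on the two outputs above: without trailing '\'s after the jar
path, bash hands spark-submit everything up to the jar (hence the successful
CreateSubmissionResponse, submitted with no application arguments) and then
tries to run the two remaining argument lines as standalone commands,
producing the "command not found" errors. A sketch of the same invocation
with the continuations in place (same placeholder paths as above):

# every line, including the jar path, ends in '\' so that the
# application arguments stay part of the spark-submit command
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  --num-decimals=1000 \
  --second-argument=Arg2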

________________________________
From: Marcelo Vanzin <van...@cloudera.com>
Sent: Tuesday, February 28, 2017 12:17:49 PM
To: Joe Olson
Cc: user@spark.apache.org
Subject: Re: spark-submit question

Everything after the jar path is passed to the main class as
parameters. So if it's not working you're probably doing something
wrong in your code (that you haven't posted).
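
One way to verify this from the command line (a sketch, assuming a
reasonably recent Spark and reusing the placeholder paths from your
example) is to add --verbose, which makes spark-submit print the
arguments it parsed, including the ones it will pass to your main class:

# local master just for a quick sanity check, not your cluster settings
./bin/spark-submit --verbose \
  --class org.apache.spark.examples.SparkPi \
  --master local[2] \
  /path/to/examples.jar \
  --num-decimals=1000 \
  --second-argument=Arg2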

On Tue, Feb 28, 2017 at 7:05 AM, Joe Olson <jo4...@outlook.com> wrote:
> For spark-submit, I know I can submit application level command line
> parameters to my .jar.
>
>
> However, can I prefix them with switches? My command line params are
> processed in my applications using JCommander. I've tried several variations
> of the below with no success.
>
>
> An example of what I am trying to do is below in the --num-decimals
> argument.
>
>
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://207.184.161.138:7077 \
>   --deploy-mode cluster \
>   --supervise \
>   --executor-memory 20G \
>   --total-executor-cores 100 \
>   /path/to/examples.jar \
>   --num-decimals=1000 \
>   --second-argument=Arg2
>
>



--
Marcelo
