target can have its own command-line options.
prajod
From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 12:36
To: Akhil Das; Prajod S Vettiyattil (WT01 - BAS)
Cc: user
Subject: Re: Re: Build spark application into uber jar
Thank you, Akhil.
Hmm... but I am using Maven.
Subject: RE: Re: Build spark application into uber jar
Hi,
When running inside the Eclipse IDE, I use another Maven target to build: that is
the default Maven target. For building the uber jar, I use the assembly jar
target.
So I use two Maven build targets in the same POM file to solve this issue.
In Maven, the assembly jar target comes from the assembly plugin.
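The two-target setup described above can be sketched as follows. This is a minimal, illustrative `pom.xml` fragment (plugin binding and ids are assumptions, not taken from the thread): the default `mvn package` still builds the plain jar for IDE use, and the bound assembly execution additionally produces the uber jar.

```xml
<!-- Sketch: bind maven-assembly-plugin so `mvn package` also builds the uber jar -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- built-in descriptor that bundles all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this binding, `mvn package` produces both the plain jar and a `*-jar-with-dependencies.jar`; alternatively, `mvn assembly:single` builds the uber jar on demand without the binding.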
Sure, thanks Prajod for the detailed steps!
bit1...@163.com
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 16:56
To: bit1...@163.com; ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Subject: RE: RE: Build spark application into uber jar
Multiple Maven profiles may be the ideal way.
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 14:39
To: user@spark.apache.org
Subject: RE: Build spark application into uber jar
But when I run the application locally, it complains that Spark-related stuff
is missing.
I use the uber jar option. What do you mean by “locally”? In the Spark Scala
shell?
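That “Spark-related stuff is missing” symptom is typically what happens when the Spark artifacts are marked `provided`: they are excluded from the uber jar (correct for cluster deployment, where spark-submit supplies them), but then they are also absent when the application is launched locally as a plain JVM process. A sketch of the dependency in question (the artifact coordinates and version are illustrative for the Spark 1.x era of this thread, not quoted from it):

```xml
<!-- scope "provided": available at compile time, excluded from the uber jar.
     Fine on a cluster, but a plain local run will fail with
     NoClassDefFoundError for Spark classes. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
  <scope>provided</scope>
</dependency>
```

Switching the scope to `compile` fixes the local run but bloats the uber jar, which is why the thread converges on using two profiles to toggle it.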
Sent: 19 June 2015 13:01
To: Prajod S Vettiyattil (WT01 - BAS); Akhil Das
Cc: user
Subject: Re: RE: Build spark application into uber jar
Thanks.
I guess what you mean by a Maven build target is a Maven profile. I added two
profiles: one is LocalRun, the other is ClusterRun, for the Spark-related
dependencies.
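The LocalRun/ClusterRun pair mentioned above might look roughly like this; only the scope of the Spark dependencies changes between the two. The property name and artifact coordinates are illustrative assumptions, and the profile ids are the ones named in the thread:

```xml
<!-- Sketch: two profiles that toggle the Spark dependency scope -->
<profiles>
  <!-- LocalRun: Spark on the runtime classpath for IDE/local runs -->
  <profile>
    <id>LocalRun</id>
    <properties>
      <spark.scope>compile</spark.scope>
    </properties>
  </profile>
  <!-- ClusterRun: Spark marked provided so the uber jar stays slim -->
  <profile>
    <id>ClusterRun</id>
    <properties>
      <spark.scope>provided</spark.scope>
    </properties>
  </profile>
</profiles>

<!-- The Spark dependencies then reference the property -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
  <scope>${spark.scope}</scope>
</dependency>
```

Select one at build time with Maven's `-P` flag: `mvn clean package -PLocalRun` for local/IDE runs, or `mvn clean package -PClusterRun` to build the uber jar for cluster deployment.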