Hi,
What about... marking the Spark libraryDependencies in build.sbt as % Provided +
sbt-assembly + sbt assembly = DONE.
Not much has changed since.
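Roughly, a minimal sketch (the plugin and Spark versions, the extra "config"
dependency, and the merge strategy below are only placeholders - adjust them
for your own project):

// project/plugins.sbt - pull in the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt - mark the Spark modules as Provided so they stay out of the fat jar
scalaVersion := "2.11.8"

val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql"  % sparkVersion % Provided,
  // everything else stays in the default scope and gets bundled
  "com.typesafe"     %  "config"     % "1.3.0"
)

// optional: deduplicate conflicting files; discarded entries show up as the
// "Strategy 'discard'" [warn] lines quoted elsewhere in the thread
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}

Running "sbt assembly" then writes a single <name>-assembly-<version>.jar
under target/scala-2.11/, and that one jar is what you pass to spark-submit.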
Jacek
On 11 Aug 2016 11:29 a.m., "Efe Selcuk" wrote:
> Bump!
>
> On Wed, Aug 10, 2016 at 2:59 PM, Efe Selcuk wrote:
>
Bump!
On Wed, Aug 10, 2016 at 2:59 PM, Efe Selcuk wrote:
> Thanks for the replies, folks.
>
> My specific use case is maybe unusual. I'm working in the context of the
> build environment in my company. Spark was being used in such a way that
> the fat assembly jar that the
Thanks for the replies, folks.
My specific use case is maybe unusual. I'm working in the context of the
build environment in my company. Spark was being used in such a way that
the fat assembly jar that the old 'sbt assembly' command outputs was used
when building a Spark application. I'm trying
Hi Efe,
Are you talking about creating an uber/fat jar for your specific
application, so that you can distribute it to another node and use the jar
there without reassembling it?
If I understand your use case correctly, I can still do this in Spark 2 as
before.
[warn] Strategy 'discard' was
How about all the dependencies? Presumably they will all go in --jars?
What if I have 10 dependencies? Any best practices for packaging apps for
Spark 2.0?
Kr
On 10 Aug 2016 6:46 pm, "Nick Pentreath" wrote:
> You're correct - Spark packaging has been shifted to not use the
What are you looking to use the assembly jar for - maybe we can think of a
workaround :)
On Wednesday, August 10, 2016, Efe Selcuk wrote:
> Sorry, I should have specified that I'm specifically looking for that fat
> assembly behavior. Is it no longer possible?
>
> On Wed,
You're correct - Spark packaging has been shifted to not use the assembly
jar.
To build Spark now, use "build/sbt package".
On Wed, 10 Aug 2016 at 19:40, Efe Selcuk wrote:
> Hi Spark folks,
>
> With Spark 1.6 the 'assembly' target for sbt would build a fat jar with
> all of the
Sorry, I should have specified that I'm specifically looking for that fat
assembly behavior. Is it no longer possible?
On Wed, Aug 10, 2016 at 10:46 AM, Nick Pentreath
wrote:
> You're correct - Spark packaging has been shifted to not use the assembly
> jar.
>
> To
Hi Spark folks,
With Spark 1.6 the 'assembly' target for sbt would build a fat jar with all
of the main Spark dependencies for building an application. Against Spark
2, that target no longer builds a Spark assembly, just ones for, e.g.,
Flume and Kafka.
I'm not well versed in Maven and sbt,