Thanks for the replies, folks.

My specific use case is maybe unusual. I'm working on my company's build
environment, where the fat assembly jar that the old 'sbt assembly' command
produced was used when building our Spark application. I'm trying to figure
out whether I can just use the many library jars instead, but in the
meantime I'm hoping to produce a fat assembly the old way so we're unblocked
on updating our application to 2.0. It's a proprietary build system, not
Maven or sbt, so it's not straightforward and the dependencies are modeled
differently.

To be a bit more clear: the fat assembly was used only to get the Spark
application to build. We run on Amazon EMR, so we don't ship that Spark
assembly over for runs.
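
If it helps to picture it: outside our proprietary system, the equivalent of
what I'm after would be a plain sbt-assembly setup with Spark marked as
provided, roughly like the sketch below (names and versions are placeholders,
not our real build):

// build.sbt -- hypothetical sketch of the application-side fat jar;
// EMR supplies Spark at runtime, so the Spark 2.0 library jars are "provided"
name := "my-spark-app"            // placeholder project name
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.0.0" % "provided"
)
// project/plugins.sbt: addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
// 'sbt assembly' then builds the application fat jar without bundling Spark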

Efe

On Wed, Aug 10, 2016 at 2:15 PM, Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Hi Efe,
>
> Are you talking about creating an uber/fat jar file for your specific
> application, so that you can distribute it to another node and use the jar
> file there without assembling it again?
>
> If I understand your use case correctly, I can still do it in Spark 2 as
> before.
>
> [warn] Strategy 'discard' was applied to 349 files
> [warn] Strategy 'first' was applied to 450 files
>
> [info] Assembly up to date:
> /data6/hduser/scala/CEP_streaming/target/scala-2.10/scala-assembly-1.0.jar
> [success] Total time: 117 s, completed Aug 10, 2016 9:31:24 PM
> Submiting the job
> Ivy Default Cache set to: /home/hduser/.ivy2/cache
> The jars for the packages stored in: /home/hduser/.ivy2/jars
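>
> In case it's useful, the 'discard'/'first' strategies in the warnings above
> come from an assemblyMergeStrategy block along these lines (a rough sketch,
> not my exact file; pick whatever rules your conflicts actually need):
>
> // assembly.sbt -- sketch of common sbt-assembly merge-strategy defaults
> assemblyMergeStrategy in assembly := {
>   case PathList("META-INF", xs @ _*) => MergeStrategy.discard
>   case _                             => MergeStrategy.first
> }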
>
>
>
> HTH
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
> On 10 August 2016 at 20:35, Holden Karau <hol...@pigscanfly.ca> wrote:
>
>> What are you looking to use the assembly jar for - maybe we can think of
>> a workaround :)
>>
>>
>> On Wednesday, August 10, 2016, Efe Selcuk <efema...@gmail.com> wrote:
>>
>>> Sorry, I should have mentioned that I'm specifically looking for that
>>> fat assembly behavior. Is it no longer possible?
>>>
>>> On Wed, Aug 10, 2016 at 10:46 AM, Nick Pentreath <
>>> nick.pentre...@gmail.com> wrote:
>>>
>>>> You're correct - Spark packaging has shifted away from producing the
>>>> assembly jar.
>>>>
>>>> To build now, use "build/sbt package".
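>>>>
>>>> For example, something like this is what I'd expect (treat the exact path
>>>> as approximate; it depends on the Scala version/profile):
>>>>
>>>> # from the Spark source checkout
>>>> build/sbt package
>>>> # instead of one fat assembly jar, the runtime jars now live in a
>>>> # directory, e.g. assembly/target/scala-2.11/jars/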
>>>>
>>>>
>>>>
>>>> On Wed, 10 Aug 2016 at 19:40, Efe Selcuk <efema...@gmail.com> wrote:
>>>>
>>>>> Hi Spark folks,
>>>>>
>>>>> With Spark 1.6, the sbt 'assembly' target would build a fat jar with
>>>>> all of the main Spark dependencies for building an application.
>>>>> Against Spark 2, that target no longer builds a Spark assembly, just
>>>>> ones for e.g. Flume and Kafka.
>>>>>
>>>>> I'm not well versed in Maven or sbt, so I don't know how to go about
>>>>> figuring this out.
>>>>>
>>>>> Is this intended? Or am I missing something?
>>>>>
>>>>> Thanks.
>>>>>
>>>>
>>>
>>
>> --
>> Cell : 425-233-8271
>> Twitter: https://twitter.com/holdenkarau
>>
>>
>
