Hi Efe,

Are you talking about creating an uber/fat jar for your specific
application, so that you can distribute it to another node and run it
from the jar without reassembling?

If I understand your use case correctly, I can still do this in Spark 2
as before.
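
For reference, this is roughly what the build files look like. This is a
minimal sketch assuming the sbt-assembly plugin; the project name,
versions and dependencies here are illustrative, not my exact build:

// project/assembly.sbt -- pulls in the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt
name := "scala"
version := "1.0"
scalaVersion := "2.10.5"

// Spark itself is marked "provided": spark-submit supplies it on the
// cluster at runtime, so it stays out of the fat jar.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided"
)

// Deduplicate files that several dependencies ship; this is what
// produces the "Strategy 'discard'/'first'" warnings below.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}

Running "sbt assembly" against that then gives: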

[warn] Strategy 'discard' was applied to 349 files
[warn] Strategy 'first' was applied to 450 files

[info] Assembly up to date:
/data6/hduser/scala/CEP_streaming/target/scala-2.10/scala-assembly-1.0.jar
[success] Total time: 117 s, completed Aug 10, 2016 9:31:24 PM
Submitting the job
Ivy Default Cache set to: /home/hduser/.ivy2/cache
The jars for the packages stored in: /home/hduser/.ivy2/jars
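
The Ivy messages above come from spark-submit resolving --packages. A
hedged sketch of the submit command (the package coordinates, class name
and master URL are placeholders, not necessarily what I used):

${SPARK_HOME}/bin/spark-submit \
  --packages com.databricks:spark-csv_2.10:1.3.0 \
  --class myapp.Main \
  --master spark://<host>:7077 \
  /data6/hduser/scala/CEP_streaming/target/scala-2.10/scala-assembly-1.0.jar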



HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw


http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 10 August 2016 at 20:35, Holden Karau <hol...@pigscanfly.ca> wrote:

> What are you looking to use the assembly jar for - maybe we can think of a
> workaround :)
>
>
> On Wednesday, August 10, 2016, Efe Selcuk <efema...@gmail.com> wrote:
>
>> Sorry, I should have specified that I'm specifically looking for that fat
>> assembly behavior. Is it no longer possible?
>>
>> On Wed, Aug 10, 2016 at 10:46 AM, Nick Pentreath <
>> nick.pentre...@gmail.com> wrote:
>>
>>> You're correct - Spark packaging has shifted away from using the
>>> assembly jar.
>>>
>>> To build now use "build/sbt package"
>>>
>>>
>>>
>>> On Wed, 10 Aug 2016 at 19:40, Efe Selcuk <efema...@gmail.com> wrote:
>>>
>>>> Hi Spark folks,
>>>>
>>>> With Spark 1.6 the 'assembly' target for sbt would build a fat jar with
>>>> all of the main Spark dependencies for building an application. Against
>>>> Spark 2, that target no longer builds a Spark assembly, just ones for
>>>> e.g. Flume and Kafka.
>>>>
>>>> I'm not well versed with maven and sbt, so I don't know how to go about
>>>> figuring this out.
>>>>
>>>> Is this intended? Or am I missing something?
>>>>
>>>> Thanks.
>>>>
>>>
>>
>
> --
> Cell : 425-233-8271
> Twitter: https://twitter.com/holdenkarau
>
>
