[ https://issues.apache.org/jira/browse/SPARK-4491?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-4491.
------------------------------
    Resolution: Won't Fix

I don't see a reason here that indicates the Spark build should change its 
behavior. I do agree that this build and its deps are hairy, and harmonizing 
dependencies within Spark and with an app is often a deep rabbit hole.
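For anyone landing here with the same error: a common workaround (a sketch, not project-endorsed advice) is to mark Spark as a "provided" dependency so it is excluded from the assembly, and to resolve the remaining conflicts with an explicit merge strategy rather than a blanket `first`. Versions below are illustrative, and the snippet assumes the sbt-assembly plugin is already enabled:

{code}
// build.sbt -- sketch of a typical Spark app assembly setup
// "provided" keeps Spark itself (and its hairy dep tree) out of the fat jar,
// since the cluster supplies those classes at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

// Handle the specific files that trigger "deduplicate" errors, instead of
// switching everything to MergeStrategy.first:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard  // signatures, manifests
  case "reference.conf"              => MergeStrategy.concat   // Typesafe Config files must be merged
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)  // fall back to the default for everything else
}
{code}

On the "why isn't `first` the default" question from the report: `deduplicate` is the default precisely because silently picking the first copy of a conflicting class can mask a real version conflict that only surfaces at runtime. `first` makes the build pass; it does not make the classpath correct.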

> Using sbt assembly with spark as dep requires Phd in sbt
> --------------------------------------------------------
>
>                 Key: SPARK-4491
>                 URL: https://issues.apache.org/jira/browse/SPARK-4491
>             Project: Spark
>          Issue Type: Question
>            Reporter: sam
>
> I get the dreaded deduplicate error from sbt. I resolved the issue (I think; 
> I managed to run the SimpleApp example) here: 
> http://stackoverflow.com/a/27018691/1586965
> My question is: is this wise? What is wrong with changing the `deduplicate` 
> strategy to `first`? Why isn't that the default?
> If this isn't the right way to make it work, could someone please explain 
> the correct way, with .sbt examples? Every example I find via Google is 
> different, because it depends on which deps the person has ... surely there 
> has to be an automagic way of doing it (if my way isn't it)?
> One final point: SBT seems to be blaming Spark for causing the problem in 
> their documentation: https://github.com/sbt/sbt-assembly -- is this fair? Is 
> Spark doing something wrong in the way it builds its jars? Or should SBT 
> be renamed to CBT (Complicated Build Tool that will make you need Cognitive 
> Behavioural Therapy after use)?
> NOTE: Satire JFF, really I love both SBT & Spark :)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
