[ https://issues.apache.org/jira/browse/SPARK-14956?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15260161#comment-15260161 ]

Michel Lemay commented on SPARK-14956:
--------------------------------------

Of course, being locked to specific versions of old libraries must be painful 
for Spark maintainers as well.

Shading and relocating works most of the time, but not always. For instance, 
it's not possible to use the logback framework when running inside Spark. See 
my post on 
[SO|http://stackoverflow.com/questions/31790944/best-way-to-send-apache-spark-loggin-to-redis-logstash-on-an-amazon-emr-cluster].
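
For what it's worth, relocation can't fix that particular case: slf4j looks up 
its binding at the fixed name org.slf4j.impl.StaticLoggerBinder and simply 
takes the first one on the classpath, which on a cluster is the one Spark 
ships, not logback-classic. A quick diagnostic I use (a sketch, assuming 
slf4j 1.7.x, where binding discovery goes through StaticLoggerBinder):

{code}
// Sketch, assuming slf4j 1.7.x: print which slf4j binding actually won.
import org.slf4j.LoggerFactory
import org.slf4j.impl.StaticLoggerBinder

object WhichBinding {
  def main(args: Array[String]): Unit = {
    // On a Spark cluster this typically reports Spark's bundled log4j
    // binding, not logback-classic, because the first StaticLoggerBinder
    // found on the classpath wins.
    println(StaticLoggerBinder.getSingleton.getLoggerFactoryClassName)
    LoggerFactory.getLogger(getClass).info("logged through the binding above")
  }
}
{code}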

Anyway, I'm not advocating Felix/OSGi specifically; I'm just seeking 
discussion on how to solve the issue.

> Spark dependencies conflicts
> ----------------------------
>
>                 Key: SPARK-14956
>                 URL: https://issues.apache.org/jira/browse/SPARK-14956
>             Project: Spark
>          Issue Type: Brainstorming
>          Components: Spark Core
>    Affects Versions: 1.6.1
>            Reporter: Michel Lemay
>            Priority: Minor
>              Labels: dependencies
>
> Since Spark ships with ages-old dependencies, we all too often have to 
> downgrade one of our own dependencies just to keep it from blowing up inside 
> Spark. Worse, these conflicts only surface at runtime, which makes them even 
> harder to diagnose.
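> To make the runtime-only aspect concrete, here is a hypothetical sketch: code 
> compiled against a newer Guava but executed against an older one on the 
> cluster classpath compiles cleanly and dies only when the method is called:
> {code}
> // Hypothetical sketch: compiled against Guava 18, run against Guava 14.
> import java.util.concurrent.TimeUnit
> import com.google.common.base.Stopwatch
>
> object RuntimeSurprise {
>   def main(args: Array[String]): Unit = {
>     // Stopwatch.createStarted() only exists since Guava 15, so with an
>     // older Guava on the runtime classpath this throws NoSuchMethodError.
>     val sw = Stopwatch.createStarted()
>     println(sw.elapsed(TimeUnit.MILLISECONDS))
>   }
> }
> {code}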
> So the usual solution, when we depend on a package directly (rather than 
> through one of our transitive dependencies), is to relocate it with the 
> maven-shade-plugin, like this:
> {code}
> <plugin>
>   <groupId>org.apache.maven.plugins</groupId>
>   <artifactId>maven-shade-plugin</artifactId>
> ...
>   <configuration>
>     <relocations>
>       <relocation>
>         <pattern>scopt</pattern>
>         <shadedPattern>hidden.scopt</shadedPattern>
>       </relocation>
>     </relocations>
>   </configuration>
> </plugin>
> {code}
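> What makes relocation attractive is that the source code doesn't change: the 
> plugin rewrites the bytecode references to hidden.scopt at package time. A 
> small sketch of the calling side (assuming scopt 3.x):
> {code}
> // Source still imports the original package; in the shaded jar these
> // references are rewritten to point at hidden.scopt instead.
> import scopt.OptionParser
>
> case class Config(input: String = "")
>
> object Args {
>   val parser = new OptionParser[Config]("myapp") {
>     opt[String]("input").action((x, c) => c.copy(input = x))
>   }
>   def parse(args: Array[String]): Option[Config] =
>     parser.parse(args, Config())
> }
> {code}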
> Other times, we must exclude transitive dependencies, like this:
> {code}
>         <dependency>
>             <groupId>com.twitter.penguin</groupId>
>             <artifactId>korean-text</artifactId>
>             <version>4.1.2</version>
>             <exclusions>
>                 <exclusion>
>                     <groupId>org.slf4j</groupId>
>                     <artifactId>slf4j-nop</artifactId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
> {code}
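> For sbt builds, the equivalent exclusion is a one-liner (a sketch, using the 
> same coordinates as above):
> {code}
> // build.sbt: exclude the slf4j-nop binding pulled in transitively.
> libraryDependencies +=
>   "com.twitter.penguin" % "korean-text" % "4.1.2" exclude("org.slf4j", "slf4j-nop")
> {code}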
> Everything related to Guava, Log4j, Jackson Databind, scopt, and even Java 8 
> falls into this category.
> I wonder if it would be possible to use an OSGi-style plugin model when 
> running code inside Spark. That would shield us from all these things.
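> For reference, Spark already exposes an experimental step in that direction: 
> spark.driver.userClassPathFirst and spark.executor.userClassPathFirst give 
> the application jar a child-first lookup. The core idea behind that kind of 
> isolation is small enough to sketch (illustrative only, not Spark's actual 
> implementation):
> {code}
> import java.net.{URL, URLClassLoader}
>
> // Child-first classloader: try the application's own jars before
> // delegating to the parent (e.g. Spark's) classpath.
> class ChildFirstClassLoader(urls: Array[URL], parent: ClassLoader)
>     extends URLClassLoader(urls, parent) {
>   override def loadClass(name: String, resolve: Boolean): Class[_] =
>     synchronized {
>       val already = findLoadedClass(name)
>       val c =
>         if (already != null) already
>         else
>           try findClass(name) // our own jars first...
>           catch {
>             case _: ClassNotFoundException =>
>               super.loadClass(name, resolve) // ...then the parent classpath
>           }
>       if (resolve) resolveClass(c)
>       c
>     }
> }
> {code}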


