[
https://issues.apache.org/jira/browse/SPARK-2848?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14086670#comment-14086670
]
Marcelo Vanzin commented on SPARK-2848:
---------------------------------------
Question for others ([~pwendell], [~sowen], maybe others): how important do you
think it is to support this from the sbt side of the build?
This is trivial to do on the maven side (just a few pom file changes). But I
can't seem to find any sbt plugin that does class relocation like
maven-shade-plugin. I could write the code, but that seems to go in the wrong
direction of keeping the sbt build code small-ish.
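For reference, the kind of class relocation maven-shade-plugin supports looks roughly like the sketch below. The shaded package name is illustrative, not necessarily what the Spark pom will use; the exclude covers the one Guava class exposed through Spark's API (Guava's Optional in the Java API), which per the issue description is forked rather than shaded:

```xml
<!-- Sketch of a maven-shade-plugin relocation; the actual pom changes may differ. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <!-- Rewrite Guava classes into a Spark-private package. -->
        <pattern>com.google.common</pattern>
        <shadedPattern>org.spark-project.guava</shadedPattern>
        <excludes>
          <!-- The Guava class exposed through Spark's API stays unshaded. -->
          <exclude>com.google.common.base.Optional*</exclude>
        </excludes>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```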
> Shade Guava in Spark deliverables
> ---------------------------------
>
> Key: SPARK-2848
> URL: https://issues.apache.org/jira/browse/SPARK-2848
> Project: Spark
> Issue Type: Sub-task
> Components: Spark Core
> Reporter: Marcelo Vanzin
> Assignee: Marcelo Vanzin
>
> As discussed in SPARK-2420, this task covers the work of shading Guava in
> Spark deliverables so that they don't conflict with the Hadoop classpath (nor
> with users' classpaths).
> Since one Guava class is exposed through Spark's API, that class will be
> forked from 14.0.1 (current version used by Spark) and excluded from any
> shading.
> The end result is that Spark's Guava won't be exposed to users anymore. This
> has the side-effect of effectively downgrading to version 11 (the one used by
> Hadoop) for those that do not explicitly depend on / package Guava with their
> apps.
--
This message was sent by Atlassian JIRA
(v6.2#6252)