[ https://issues.apache.org/jira/browse/HADOOP-14284?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16126187#comment-16126187 ]

stack commented on HADOOP-14284:
--------------------------------

If it helps, here is what was done over in hbase to make it so we could upgrade 
guava, netty, protobuf, etc., w/o damage to downstreamers or having to use 
whatever hadoop et al. happened to have on the CLASSPATH:

 * We made a little project, hbase-thirdparty 
(https://github.com/apache/hbase-thirdparty). Its only charge is providing 
mainline hbase with relocated popular libs such as guava and netty. The project 
comprises nought but poms (caveat some hacky patching of protobuf our project 
requires); the pull and relocation are moved out of the mainline hbase build. A 
rough sketch of the relocation stanza is below, after this list.
 * We changed mainline hbase to use relocated versions of popular libs. This 
was mostly a case of changing imports from, for example, 
com.google.protobuf.Message to 
org.apache.hadoop.hbase.shaded.com.google.protobuf.Message (an unfortunate 
decision a while back saddled us w/ the extra-long relocation prefix).
 * As part of the mainline build, we run com.google.code.maven-replacer-plugin 
to rewrite third-party references in generated code so they refer to our 
relocated versions instead (second sketch below).

Upside is we can update core libs whenever we wish. Should a lib turn 
problematic, we can add it to the relocated set. Downside is having to be sure 
we always refer to the relocated versions in code.

While the pattern is straightforward, the above project took a good while to 
implement, mostly because infra is a bit shaky and our test suite has a host of 
flakies in it; most of the time went to verifying that a test was failing 
because it was flakey and not because of the relocation.

If you want to do a similar project in hadoop, I'd be game to help out.

> Shade Guava everywhere
> ----------------------
>
>                 Key: HADOOP-14284
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14284
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 3.0.0-alpha4
>            Reporter: Andrew Wang
>            Assignee: Tsuyoshi Ozawa
>            Priority: Blocker
>         Attachments: HADOOP-14238.pre001.patch, HADOOP-14284.002.patch, 
> HADOOP-14284.004.patch, HADOOP-14284.007.patch, HADOOP-14284.010.patch, 
> HADOOP-14284.012.patch
>
>
> HADOOP-10101 upgraded the guava version for 3.x to 21.
> Guava is broadly used by Java projects that consume our artifacts. 
> Unfortunately, these projects also consume our private artifacts like 
> {{hadoop-hdfs}}. They are also unlikely to be on the new shaded client 
> introduced by HADOOP-11804, currently only available in 3.0.0-alpha2.
> We should shade Guava everywhere to proactively avoid breaking downstreams. 
> This isn't a requirement for all dependency upgrades, but it's necessary for 
> known-bad dependencies like Guava.


