[ https://issues.apache.org/jira/browse/HADOOP-9991?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13822050#comment-13822050 ]

stack commented on HADOOP-9991:
-------------------------------

[[email protected]] Thanks for taking on this masochistic task.  Given Hadoop's 
surface area, it is hard to figure out whether a dependency is needed.  How 
about we go radical in trunk: it will be ten years before there is a hadoop3, 
which should be time enough to discover which of the removed dependencies 
should have been left in.  hadoop2 could do w/ just a purge of at least the 
just-not-used.  I'd like to help out.  In HBase we put on a few filters to try 
and block the plainly farcical, but god-bless-maven, you have to reproduce this 
set each time for each Hadoop version we build against since there is no 
xinclude... 
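The "filters" mentioned above can be expressed with the maven-enforcer-plugin's bannedDependencies rule. Below is an illustrative sketch, not HBase's actual configuration; the banned artifacts are examples of the kind of duplicate logging bindings that commonly leak in transitively, not a vetted list.

```xml
<!-- Hedged sketch: fail the build if known-conflicting artifacts appear
     anywhere in the dependency graph. Artifact choices are illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>ban-duplicate-log-bindings</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <excludes>
              <!-- keep a single logging facade; route others through bridges -->
              <exclude>commons-logging:commons-logging</exclude>
              <exclude>log4j:log4j</exclude>
            </excludes>
          </bannedDependencies>
        </rules>
        <fail>true</fail>
      </configuration>
    </execution>
  </executions>
</plugin>
```

As the comment notes, with no xinclude support in Maven, a block like this has to be copied into each build profile that targets a different Hadoop version.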

> Fix up Hadoop Poms for enforced dependencies, roll up JARs to latest versions
> -----------------------------------------------------------------------------
>
>                 Key: HADOOP-9991
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9991
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: build
>    Affects Versions: 2.3.0, 2.1.1-beta
>            Reporter: Steve Loughran
>         Attachments: hadoop-9991-v1.txt
>
>
> If you try using Hadoop downstream with a classpath shared with HBase and 
> Accumulo, you soon discover how messy the dependencies are.
> Hadoop's side of this problem is:
> # not being up to date with some of the external releases of common JARs
> # not locking down/excluding inconsistent versions of artifacts provided down 
> the dependency graph
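Point 2 above is usually addressed by pinning each common JAR to a single version in the parent POM's dependencyManagement section, so every transitive occurrence resolves to it. A minimal sketch follows; the artifact and version are examples only, not the ones this patch proposes.

```xml
<!-- Hedged sketch: lock one version of a widely shared JAR so downstream
     consumers (HBase, Accumulo, ...) see a consistent classpath. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>11.0.2</version> <!-- example version, for illustration -->
    </dependency>
  </dependencies>
</dependencyManagement>
```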



--
This message was sent by Atlassian JIRA
(v6.1#6144)
