[
https://issues.apache.org/jira/browse/HBASE-18304?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16112938#comment-16112938
]
Tamas Penzes commented on HBASE-18304:
--------------------------------------
Hi [~busbey],
Please see my comments inline.
> 1042 <exclude>com.google.protobuf:protobuf-java</exclude>
> This is going to be a nightmare due to our purposeful handling of multiple
> versions. But maybe I'm misunderstanding it, since shouldn't our internal use
> of protobuf 3 be masked since we relocate it in third-party-deps?
We only reference protobuf 3.3.0 in hbase-protocol-shaded now, but
hbase-protocol-shaded is a dependency of hbase-client, hbase-procedure and
hbase-server, and through the transitive dependencies it causes a conflict in
these three modules.
If I exclude protobuf from the hbase-protocol-shaded dependency in these three
modules, it looks okay. Is that acceptable?
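Concretely, the sketch I have in mind for each of the three modules' poms would
be something like the following (just an illustration, not what is in the patch
yet):
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-protocol-shaded</artifactId>
    <exclusions>
      <!-- keep the protobuf 3 used by hbase-protocol-shaded from leaking
           onto this module's classpath via the transitive dependency -->
      <exclusion>
        <groupId>com.google.protobuf</groupId>
        <artifactId>protobuf-java</artifactId>
      </exclusion>
    </exclusions>
  </dependency>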
> 1043 <exclude>org.slf4j:slf4j-log4j12</exclude>
> This one should be easy to just set to latest.
If I add org.slf4j:slf4j-log4j12:${slf4j.version} to hbase-client as a
dependency, it solves the problem.
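That is, roughly this in hbase-client's pom:
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>${slf4j.version}</version>
  </dependency>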
> 1044 <exclude>com.google.guava:guava</exclude>
> Maybe solved for us by our move to third-party-deps? Shouldn't only Hadoop's
> show up? or is the conflict in spark or some such? (questions for the
> eventual follow-on JIRA)
Almost solved. org.tachyonproject:tachyon-client uses guava 14.0.1 and is
referenced both directly and transitively from org.apache.spark:spark-core_2.10;
otherwise we only use guava 11.0.2. If I exclude it from the spark-core_2.10
transitive dependencies in hbase-spark and hbase-spark-it, it works.
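Something like this on the spark-core_2.10 dependency in hbase-spark and
hbase-spark-it (the ${spark.version} property here is only a placeholder on my
side):
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <exclusions>
      <!-- drop the guava 14.0.1 pulled in via tachyon-client -->
      <exclusion>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
      </exclusion>
    </exclusions>
  </dependency>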
> 1045 <exclude>com.thoughtworks.paranamer:paranamer</exclude>
> 1046 <exclude>commons-net:commons-net</exclude>
> 1047 <exclude>net.java.dev.jets3t:jets3t</exclude>
> These should go okay.
Do you mean they should stay excluded from the check, or that I should add them
to hbase-spark and hbase-spark-it as direct dependencies?
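If you mean the latter, I would just declare them directly so the highest
version wins, e.g. (version left as a placeholder, and the same for the other
two artifacts):
  <dependency>
    <groupId>com.thoughtworks.paranamer</groupId>
    <artifactId>paranamer</artifactId>
    <version><!-- highest version reported by the enforcer check --></version>
  </dependency>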
> 1048 <exclude>org.scala-lang:scala-library</exclude>
> 1049 <exclude>org.scala-lang:scala-reflect</exclude>
> These are probably just an error in our spark module. Best not to try to
> address it until we close out HBASE-16179
Okay. They stay excluded from the check.
> 1050 <exclude>io.netty:netty</exclude>
> I think also solved by our move to third-party-deps on HBASE-18271
Just as with guava, org.apache.spark:spark-core_2.10 causes the problem: it
pulls in netty 3.8.0.Final as a transitive dependency while we use 3.6.2.Final
everywhere else.
Should I exclude it from spark-core's dependencies manually?
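If so, it would be the same pattern as the guava case above, i.e. one more
exclusion on the spark-core_2.10 dependency:
      <exclusion>
        <groupId>io.netty</groupId>
        <artifactId>netty</artifactId>
      </exclusion>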
Thanks, Tamas
> Start enforcing upperbounds on dependencies
> -------------------------------------------
>
> Key: HBASE-18304
> URL: https://issues.apache.org/jira/browse/HBASE-18304
> Project: HBase
> Issue Type: Task
> Components: build, dependencies
> Affects Versions: 2.0.0
> Reporter: Sean Busbey
> Assignee: Tamas Penzes
> Labels: beginner
> Fix For: 2.0.0
>
> Attachments: HBASE-18304.master.001.patch,
> HBASE-18304.master.002.patch, HBASE-18304.master.002.patch
>
>
> would be nice to get this going before our next major version.
> http://maven.apache.org/enforcer/enforcer-rules/requireUpperBoundDeps.html