[
https://issues.apache.org/jira/browse/ACCUMULO-3100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123157#comment-14123157
]
Josh Elser commented on ACCUMULO-3100:
--------------------------------------
bq. The issue with MAC was that the maven plugins were using a fork of guava,
hence the ordering issues.
No, that's a completely separate issue about some sisu-plugins repackaging
Guava classes. ACCUMULO-2714, linked above, exists purely because we depend on
Guava 15.0 while Hadoop-2 depends on 14.0.1.
bq. If we pull back the version, we break compatibility for all users who may
be relying on guava features introduced after whatever version hadoop is using
Right now we're also playing with fire for any new version of Hadoop. Like I
said, the only reason this hasn't bitten us in production yet is that the
Hadoop client APIs don't use the conflicting Guava classes directly. As it
stands, it's extremely confusing and misleading to users: Accumulo-1.6.0 with
Guava 15.0 (which is implied to work given our dependencies) does not work
with anything >=Hadoop-2.4.0.
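For reference, pulling the version back would just be a change to the managed
Guava version in the POM. A hypothetical sketch (assuming the version is
managed in a {{dependencyManagement}} section, as is typical):

```xml
<!-- Hypothetical sketch, not the actual Accumulo POM: pin Guava to the
     version Hadoop-2 ships with. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```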
bq. our previous guava version that DID affect us
Please tell us what the actual bugs were; as I already stated, I didn't see
any between 14.0.1 and 15.0 that were likely to affect our usage of Guava.
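To make the mismatch concrete, a quick stdlib-only probe (a hypothetical
diagnostic, not Accumulo code) shows whether the class Hadoop's CryptoUtils
path needs is actually loadable:

```java
// Hypothetical diagnostic, not part of Accumulo: probe the classpath for a
// class by name. With Guava 15.0 on the classpath (or no Guava at all),
// com.google.common.io.LimitInputStream is reported missing -- which is
// exactly what surfaces as NoClassDefFoundError at job submission time.
public class GuavaProbe {
    public static boolean classPresent(String name) {
        try {
            // initialize=false: we only care whether the class can be found,
            // not about running its static initializers
            Class.forName(name, false, GuavaProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Present through Guava 14.0.1, removed in 15.0
        System.out.println("LimitInputStream present: "
                + classPresent("com.google.common.io.LimitInputStream"));
    }
}
```

Running this against an Accumulo-1.6.0 classpath (Guava 15.0) reports the
class as missing, matching the stack trace below.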
> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>
> Key: ACCUMULO-3100
> URL: https://issues.apache.org/jira/browse/ACCUMULO-3100
> Project: Accumulo
> Issue Type: Bug
> Components: test
> Affects Versions: 1.6.0
> Reporter: Josh Elser
> Assignee: Josh Elser
> Fix For: 1.6.1, 1.7.0
>
>
> JobSubmitter makes a call out to CryptoUtils to test for the encrypted
> shuffle support that was recently added to branch-2 (specifically HDFS-6134
> and HADOOP-10150, looking at the blame):
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:380)
> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
> at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1312)
> {noformat}
> Because of this, we can't run the test: LimitInputStream can't be loaded,
> since we depend on Guava 15.0, which no longer contains it.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)