[ https://issues.apache.org/jira/browse/ACCUMULO-3100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123682#comment-14123682 ]

Josh Elser commented on ACCUMULO-3100:
--------------------------------------

bq. Doing this right will require ACCUMULO-1483, right?

That's half of it. The other half would be making sure that user iterators on 
the server are also happy.

bq. if that doesn't happen in time we document that Accumulo 1.6 only runs on 
up to Hadoop 2.5.0?

I'd be -1 for this. Like you said, we don't make guarantees on what 
dependencies we provide version to version. I still think the best immediate 
solution is to downgrade to Guava 14.0.1 and work to get Accumulo onto a more 
recent Hadoop version in a quicker timeframe, as well as toward a better 
user/server space isolation story. If users want to substitute their own 
version of Guava, they can still do so, but it's tagged as YMMV. We advertise 
dependencies that actually work across the breadth of (stable) Hadoop-2, and 
users who want newer versions of dependencies can provide them on their own.
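As a concrete illustration of that last point, here is a minimal sketch of how a downstream user could pin their own Guava version in a Maven build via dependencyManagement; the version shown (15.0) is just an example, and whether it works with a given Hadoop release is exactly the YMMV caveat above:

```xml
<!-- In the downstream project's pom.xml: force a specific Guava version
     ahead of whatever Accumulo (or Hadoop) pulls in transitively. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>15.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```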

> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>
>                 Key: ACCUMULO-3100
>                 URL: https://issues.apache.org/jira/browse/ACCUMULO-3100
>             Project: Accumulo
>          Issue Type: Bug
>          Components: test
>    Affects Versions: 1.6.0
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>             Fix For: 1.6.1, 1.7.0
>
>
> JobSubmitter makes a call out to CryptoUtils to test for encrypted-shuffle 
> support that was recently added to branch-2 (specifically HDFS-6134 and 
> HADOOP-10150, looking at the blame):
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>       at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:380)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1312)
> {noformat}
> Because of this, the test can't run: we depend on Guava 15.0, which no 
> longer contains LimitInputStream (the class was removed in Guava 15.0).
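The failure above can be reproduced as a simple classpath probe. This is a hedged sketch (the class name `GuavaCheck` and its helper are hypothetical, not part of Accumulo or Hadoop); it reports whether the Guava on the classpath still ships `com.google.common.io.LimitInputStream`:

```java
public class GuavaCheck {

    // Returns true if the Guava on the classpath still ships LimitInputStream
    // (Guava 14.x or earlier); false with Guava 15.0+ (or no Guava at all),
    // where the class was removed -- the situation that makes Hadoop's
    // JobSubmitter throw NoClassDefFoundError.
    static boolean hasLimitInputStream() {
        try {
            Class.forName("com.google.common.io.LimitInputStream");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasLimitInputStream()
                ? "Guava <= 14.x on the classpath: LimitInputStream present"
                : "Guava 15.0+ (or none) on the classpath: LimitInputStream missing");
    }
}
```

Unlike the MapReduce job submission path, the probe handles the missing class gracefully instead of dying with a `NoClassDefFoundError`, which is why a reflective check like this is a common guard around optional dependencies.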



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
