[ https://issues.apache.org/jira/browse/ACCUMULO-3100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123188#comment-14123188 ]

Christopher Tubbs commented on ACCUMULO-3100:
---------------------------------------------

Perhaps a quick fix would be to submit a patch to Hadoop that fixes its use of 
deprecated and beta code? As the Guava docs note, LimitInputStream is both 
deprecated and beta in the version that Hadoop is using. In fact, 
LimitInputStream has always been marked beta, so Hadoop should probably have 
avoided it from the beginning, and certainly should have dropped it as soon as 
it was deprecated.

We can't resolve all dependency management issues, but we can address this one 
by submitting a patch to Hadoop that replaces LimitInputStream with 
ByteStreams.limit(), as the deprecation notice suggests.
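For reference, the migration is a one-line change at each call site: Guava's 
ByteStreams.limit(in, n) returns an InputStream that reads at most n bytes 
from the wrapped stream, which is the same behavior LimitInputStream provided. 
The LimitedStream class below is only a hypothetical, minimal re-implementation 
of those semantics for illustration, not Hadoop's actual patch:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Minimal sketch of the "limit" semantics: wrap an InputStream so that at
// most `limit` bytes can be read from it before end-of-stream is reported.
// Guava's ByteStreams.limit() (and, before it, LimitInputStream) does this.
class LimitedStream extends InputStream {
    private final InputStream in;
    private long remaining;

    LimitedStream(InputStream in, long limit) {
        this.in = in;
        this.remaining = limit;
    }

    @Override
    public int read() throws IOException {
        if (remaining <= 0) {
            return -1; // limit reached: behave as if the stream ended
        }
        int b = in.read();
        if (b != -1) {
            remaining--;
        }
        return b;
    }
}

public class LimitDemo {
    public static void main(String[] args) throws IOException {
        InputStream raw =
            new ByteArrayInputStream("hello world".getBytes(StandardCharsets.UTF_8));
        // Code migrating off LimitInputStream would instead write:
        //   InputStream limited = ByteStreams.limit(raw, 5);
        InputStream limited = new LimitedStream(raw, 5);
        StringBuilder sb = new StringBuilder();
        int b;
        while ((b = limited.read()) != -1) {
            sb.append((char) b);
        }
        System.out.println(sb); // only the first 5 bytes: "hello"
    }
}
```

Since ByteStreams.limit() has been present since Guava 14.0, a patched Hadoop 
would compile against both the old and new Guava versions in play here.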

I don't know that it makes sense for us to go backwards to an older version 
after we've already released with a dependency on a newer one.

> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>
>                 Key: ACCUMULO-3100
>                 URL: https://issues.apache.org/jira/browse/ACCUMULO-3100
>             Project: Accumulo
>          Issue Type: Bug
>          Components: test
>    Affects Versions: 1.6.0
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>             Fix For: 1.6.1, 1.7.0
>
>
> JobSubmitter makes a call out to CryptoUtils to test for encrypted-shuffle 
> support that was recently added to branch-2 (specifically HDFS-6134 and 
> HADOOP-10150, per the blame).
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>       at 
> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:380)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1312)
> {noformat}
> Because of this, the test fails: Hadoop references LimitInputStream, but we 
> depend on Guava 15.0, which no longer contains that class.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
