See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3016/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31882 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.967 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.514 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.557 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.299 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests:
  TestNetworkedJob.testNetworkedJob:174 expected:<[[Fri Feb 26 22:11:40 +0000 2016] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>

Tests in error:
  TestMRCredentials.setUp:66 » YarnRuntime java.io.IOException: ResourceManager ...
  TestJHSSecurity.testDelegationToken:225 » NoClassDefFound org/apache/hadoop/se...
  TestEncryptedShuffle.encryptedShuffleWithClientCerts:167->encryptedShuffleWithCerts:138->startCluster:107 » YarnRuntime
  TestEncryptedShuffle.encryptedShuffleWithoutClientCerts:172->encryptedShuffleWithCerts:138->startCluster:107 » YarnRuntime

Tests run: 526, Failures: 1, Errors: 4, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.890 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.672 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.437 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:35 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [ 01:56 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-02-26T22:54:24+00:00
[INFO] Final Memory: 37M/606M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.

FAILED: org.apache.hadoop.mapreduce.security.TestJHSSecurity.testDelegationToken

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
    at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
    at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
    at org.apache.hadoop.mapreduce.security.TestJHSSecurity.testDelegationToken(TestJHSSecurity.java:225)

FAILED: org.apache.hadoop.mapreduce.security.TestMRCredentials.org.apache.hadoop.mapreduce.security.TestMRCredentials

Error Message:
java.io.IOException: ResourceManager failed to start. Final state is STOPPED

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.IOException: ResourceManager failed to start. Final state is STOPPED
    at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:332)
    at org.apache.hadoop.yarn.server.MiniYARNCluster.access$400(MiniYARNCluster.java:100)
    at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:458)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
    at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
    at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
    at org.apache.hadoop.mapreduce.security.TestMRCredentials.setUp(TestMRCredentials.java:66)

FAILED: org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithClientCerts

Error Message:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:180)
    at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
    at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
    at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1487)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:456)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:811)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:705)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
    at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
    at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
    at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.startCluster(TestEncryptedShuffle.java:107)
    at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithCerts(TestEncryptedShuffle.java:138)
    at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithClientCerts(TestEncryptedShuffle.java:167)

FAILED: org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithoutClientCerts

Error Message:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2
    at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:180)
    at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
    at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
    at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1487)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:456)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
    at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:811)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:705)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
    at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
    at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
    at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.startCluster(TestEncryptedShuffle.java:107)
    at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithCerts(TestEncryptedShuffle.java:138)
    at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithoutClientCerts(TestEncryptedShuffle.java:172)

FAILED: org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
expected:<[[Fri Feb 26 22:11:40 +0000 2016] Application is Activated, waiting for resources to be assigned for AM.
 Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>

Stack Trace:
org.junit.ComparisonFailure: expected:<[[Fri Feb 26 22:11:40 +0000 2016] Application is Activated, waiting for resources to be assigned for AM.
 Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>
    at org.junit.Assert.assertEquals(Assert.java:115)
    at org.junit.Assert.assertEquals(Assert.java:144)
    at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:174)
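Note (not part of the Jenkins output): all five failures sit in the hadoop-mapreduce-client-jobclient module named in the reactor summary and in the Maven resume hint above. As a minimal sketch, assuming a local checkout of trunk and the same surefire setup, the failing tests can usually be re-run in isolation with surefire's standard -Dtest filter and Maven's module selector; exact goals and profiles are whatever you normally build with.

    # Sketch only: re-run the failed JobClient tests locally.
    # -pl :hadoop-mapreduce-client-jobclient picks the module by artifact id (Maven 3+),
    # -am also builds its in-reactor dependencies, and -Dtest limits surefire to the
    # listed test classes.
    mvn test -pl :hadoop-mapreduce-client-jobclient -am \
        -Dtest=TestNetworkedJob,TestMRCredentials,TestJHSSecurity,TestEncryptedShuffle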