See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/923/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9662 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.03 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.872 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.933 sec - in org.apache.hadoop.mapred.TestReporter

Results :

Failed tests: 
  TestNetworkedJob.testNetworkedJob:174 expected:<[[Thu Jan 21 17:42:09 +0000 2016] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>

Tests in error: 
  TestJHSSecurity.testDelegationToken:111 » YarnRuntime Failed to intialize exis...

Tests run: 527, Failures: 1, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.341 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.473 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.323 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-01-21T18:23:56+00:00
[INFO] Final Memory: 34M/209M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.

FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
expected:<[[Thu Jan 21 17:42:09 +0000 2016] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>

Stack Trace:
org.junit.ComparisonFailure: expected:<[[Thu Jan 21 17:42:09 +0000 2016] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:16> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ; ]> but was:<[]>
	at org.junit.Assert.assertEquals(Assert.java:115)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:174)


FAILED:  org.apache.hadoop.mapreduce.security.TestJHSSecurity.testDelegationToken

Error Message:
Failed to intialize existing directories

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to intialize existing directories
	at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:460)
	at org.apache.hadoop.fs.DelegateToFileSystem.listStatus(DelegateToFileSystem.java:168)
	at org.apache.hadoop.fs.ChecksumFs.listStatus(ChecksumFs.java:521)
	at org.apache.hadoop.fs.AbstractFileSystem$1.<init>(AbstractFileSystem.java:890)
	at org.apache.hadoop.fs.AbstractFileSystem.listStatusIterator(AbstractFileSystem.java:888)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1494)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:809)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:703)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.security.TestJHSSecurity.testDelegationToken(TestJHSSecurity.java:111)
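
For anyone chasing these down locally, the console output above suggests resuming the build with "mvn <goals> -rf :hadoop-mapreduce-client-jobclient" after fixing the problems. A quicker sketch for reproducing only the two failing classes is below; it assumes a trunk checkout with the prerequisite modules buildable, and the module path, -pl/-am flags, and Surefire -Dtest filter are ordinary Maven options chosen for illustration, not commands taken from this log (both test classes appear to live in the jobclient module, judging by the surefire-reports path above):

  # Sketch only: run the two failing test classes in the jobclient module
  # from the root of a Hadoop trunk checkout (module path assumed from the
  # surefire-reports path in the log; -am builds required upstream modules).
  mvn test -am \
    -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient \
    -Dtest=TestNetworkedJob,TestJHSSecurity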