See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/2999/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 43010 lines...]
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.539 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.094 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestNetworkedJob.testNetworkedJob:174 expected:<[[Tue Feb 23 12:29:15 +0000 2016] Scheduler has assigned a container for AM, waiting for AM container to be launched]> but was:<[]>

Tests in error: 
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:234 »
  TestUberAM>TestMRJobs.testConfVerificationWithJobClient:326->TestMRJobs.testConfVerification:411 »
  TestUberAM>TestMRJobs.testConfVerificationWithClassloaderCustomClasses:316->TestMRJobs.testConfVerification:365 » NoClassDefFound
  TestJobOutputCommitter.tearDown:67->HadoopTestCase.tearDown:170 » NoClassDefFound
  TestJobOutputCommitter.setUp:59->HadoopTestCase.setUp:157 » YarnRuntime could ...
  TestJobOutputCommitter.setUp:59->HadoopTestCase.setUp:157 » YarnRuntime could ...
  TestDataDrivenDBInputFormat.testDateSplits:214 » NoClassDefFound org/apache/ha...
  TestMapReduceChain>HadoopTestCase.setUp:146 » Runtime org.xml.sax.SAXParseExce...

Tests run: 514, Failures: 1, Errors: 5, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.772 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.848 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.069 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:53 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:10 h
[INFO] Finished at: 2016-02-23T12:36:45+00:00
[INFO] Final Memory: 33M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.lib.chain.TestMapReduceChain.testChain

Error Message:
org.xml.sax.SAXParseException; systemId: jar:file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-common/3.0.0-SNAPSHOT/hadoop-yarn-common-3.0.0-SNAPSHOT.jar!/yarn-default.xml; lineNumber: 2157; columnNumber: 15; The element type "description" must be terminated by the matching end-tag "</description>".

Stack Trace:
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: jar:file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-common/3.0.0-SNAPSHOT/hadoop-yarn-common-3.0.0-SNAPSHOT.jar!/yarn-default.xml; lineNumber: 2157; columnNumber: 15; The element type "description" must be terminated by the matching end-tag "</description>".
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2735)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2567)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2471)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1045)
        at org.apache.hadoop.mapred.JobConf.checkAndWarnDeprecation(JobConf.java:2090)
        at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:428)
        at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:146)
        at junit.framework.TestCase.runBare(TestCase.java:139)
        at junit.framework.TestResult$1.protect(TestResult.java:122)
        at junit.framework.TestResult.runProtected(TestResult.java:142)
        at junit.framework.TestResult.run(TestResult.java:125)
        at junit.framework.TestCase.run(TestCase.java:129)
        at junit.framework.TestSuite.runTest(TestSuite.java:255)
        at junit.framework.TestSuite.run(TestSuite.java:250)
        at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
        at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.xml.sax.SAXParseException: The element type "description" must be terminated by the matching end-tag "</description>".
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2555)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2543)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2614)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2567)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2471)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1045)
        at org.apache.hadoop.mapred.JobConf.checkAndWarnDeprecation(JobConf.java:2090)
        at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:428)
        at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:146)
        at junit.framework.TestCase.runBare(TestCase.java:139)
        at junit.framework.TestResult$1.protect(TestResult.java:122)
        at junit.framework.TestResult.runProtected(TestResult.java:142)
        at junit.framework.TestResult.run(TestResult.java:125)
        at junit.framework.TestCase.run(TestCase.java:129)
        at junit.framework.TestSuite.runTest(TestSuite.java:255)
        at junit.framework.TestSuite.run(TestSuite.java:250)
        at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
        at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
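
The parse failure above comes from the yarn-default.xml bundled inside the hadoop-yarn-common 3.0.0-SNAPSHOT jar on the test classpath. As a rough standalone check (a sketch only, not part of the build; the class name is made up), something like the following parses whichever copy of yarn-default.xml the classpath actually exposes and fails with the same SAXParseException if that copy is not well-formed:

    import java.io.InputStream;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;

    // Hypothetical one-off check, not part of the Hadoop test suite.
    public class CheckYarnDefaultXml {
        public static void main(String[] args) throws Exception {
            try (InputStream in = CheckYarnDefaultXml.class.getClassLoader()
                    .getResourceAsStream("yarn-default.xml")) {
                if (in == null) {
                    System.out.println("yarn-default.xml not found on the classpath");
                    return;
                }
                DocumentBuilder db =
                        DocumentBuilderFactory.newInstance().newDocumentBuilder();
                // Throws SAXParseException if, for example, a <description>
                // element is never closed, which is what the build reports above.
                db.parse(in);
                System.out.println("yarn-default.xml parses cleanly");
            }
        }
    }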


FAILED:  org.apache.hadoop.mapreduce.lib.db.TestDataDrivenDBInputFormat.testDateSplits

Error Message:
org/apache/hadoop/yarn/exceptions/YarnRuntimeException

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnRuntimeException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:92)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:172)
        at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:788)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1359)
        at org.apache.hadoop.mapreduce.lib.db.TestDataDrivenDBInputFormat.testDateSplits(TestDataDrivenDBInputFormat.java:214)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomCleanup

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
        at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
        at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
        at org.apache.hadoop.mapred.HadoopTestCase.tearDown(HadoopTestCase.java:170)
        at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:67)
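
This NoClassDefFoundError and the YarnRuntimeException one in TestDataDrivenDBInputFormat above both show the test JVM failing to resolve Hadoop classes at runtime. A quick probe along these lines (a hypothetical class, run with the same classpath Surefire hands the forked JVM) reports which of the two classes named in the errors are actually loadable:

    // Hypothetical classpath probe; the class names come from the errors above.
    public class ClasspathProbe {
        public static void main(String[] args) {
            String[] names = {
                "org.apache.hadoop.yarn.exceptions.YarnRuntimeException",
                "org.apache.hadoop.service.ServiceOperations"
            };
            for (String name : names) {
                try {
                    Class.forName(name);
                    System.out.println("OK      " + name);
                } catch (Throwable t) {
                    System.out.println("MISSING " + name + " (" + t + ")");
                }
            }
        }
    }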


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testDefaultCleanupAndAbort

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
        at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
        at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
        at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
        at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:157)
        at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:59)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
        at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
        at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
        at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
        at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:157)
        at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:59)
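
testDefaultCleanupAndAbort and testCustomAbort fail identically: FileContext cannot map the file scheme to an AbstractFileSystem because fs.AbstractFileSystem.file.impl resolves to null. That key is normally supplied by core-default.xml. The sketch below sets the binding explicitly, assuming the usual org.apache.hadoop.fs.local.LocalFs implementation; it only illustrates the missing key and is not a fix for the build:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileContext;

    // Sketch only: shows the configuration key the exception complains about.
    // On a healthy classpath core-default.xml already provides this binding.
    public class LocalFsContextCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Assumed default binding for the file:// scheme.
            conf.setIfUnset("fs.AbstractFileSystem.file.impl",
                    "org.apache.hadoop.fs.local.LocalFs");
            FileContext fc = FileContext.getFileContext(conf);
            System.out.println("local working dir: " + fc.getWorkingDirectory());
        }
    }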


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
expected:<[[Tue Feb 23 12:29:15 +0000 2016] Scheduler has assigned a container for AM, waiting for AM container to be launched]> but was:<[]>

Stack Trace:
org.junit.ComparisonFailure: expected:<[[Tue Feb 23 12:29:15 +0000 2016] Scheduler has assigned a container for AM, waiting for AM container to be launched]> but was:<[]>
        at org.junit.Assert.assertEquals(Assert.java:115)
        at org.junit.Assert.assertEquals(Assert.java:144)
        at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:174)

