See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/2423/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 33033 lines...]
Results :

Tests in error: 
  TestRecovery.testOutputRecovery:862->writeOutput:1877 NoClassDefFound org/apac...
  TestRecovery.testOutputRecoveryMapsOnly:930 » YarnRuntime could not cleanup te...
  TestRecovery.testRecoverySuccessUsingCustomOutputCommitter:459 » YarnRuntime c...
  TestRecovery.testCrashed:130 » YarnRuntime could not cleanup test dir
  TestRecovery.testSpeculative:1164 » YarnRuntime could not cleanup test dir
  TestRecovery.testRecoveryWithoutShuffleSecret:1316 » YarnRuntime could not cle...
  TestRecovery.testRecoveryWithOldCommiter:1045 » YarnRuntime could not cleanup ...
  TestAMWebServicesJobConf.testJobConf:172->verifyAMJobConf:237 NoClassDefFound ...
  TestAMWebServicesJobConf.testJobConfXML:229->verifyAMJobConfXML:257 NoClassDefFound
  TestAMWebServicesJobConf.testJobConfSlash:190->verifyAMJobConf:237 NoClassDefFound
  TestAMWebServicesJobConf.testJobConfDefault:207->verifyAMJobConf:237 NoClassDefFound

Tests run: 338, Failures: 0, Errors: 11, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.894 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.128 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.930 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [08:42 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10:56 min
[INFO] Finished at: 2015-10-05T09:10:16+00:00
[INFO] Final Memory: 40M/825M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Updating HDFS-9151
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
11 tests failed.

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecovery

Error Message:
org/apache/hadoop/io/NullWritable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/io/NullWritable
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.writeOutput(TestRecovery.java:1877)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecovery(TestRecovery.java:862)

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecoveryMapsOnly

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:239)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:208)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:189)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppWithHistory.<init>(TestRecovery.java:1928)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecoveryMapsOnly(TestRecovery.java:930)

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoverySuccessUsingCustomOutputCommitter

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:239)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:208)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:189)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppWithHistory.<init>(TestRecovery.java:1928)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoverySuccessUsingCustomOutputCommitter(TestRecovery.java:459)

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:239)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:208)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:189)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppWithHistory.<init>(TestRecovery.java:1928)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:130)

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:239)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:208)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:189)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppWithHistory.<init>(TestRecovery.java:1928)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative(TestRecovery.java:1164)

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithoutShuffleSecret

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:239)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:208)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:189)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppWithHistory.<init>(TestRecovery.java:1928)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppNoShuffleSecret.<init>(TestRecovery.java:1960)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithoutShuffleSecret(TestRecovery.java:1316)

FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithOldCommiter

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
        at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
        at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
        at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
        at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
        at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:239)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:208)
        at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:189)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery$MRAppWithHistory.<init>(TestRecovery.java:1928)
        at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithOldCommiter(TestRecovery.java:1045)

FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf

Error Message:
org/apache/hadoop/yarn/webapp/WebServicesTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/webapp/WebServicesTestUtils
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.verifyAMJobConf(TestAMWebServicesJobConf.java:237)
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf(TestAMWebServicesJobConf.java:172)

FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML

Error Message:
org/apache/hadoop/yarn/webapp/WebServicesTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/webapp/WebServicesTestUtils
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.verifyAMJobConfXML(TestAMWebServicesJobConf.java:257)
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML(TestAMWebServicesJobConf.java:229)

FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash

Error Message:
org/apache/hadoop/yarn/webapp/WebServicesTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/webapp/WebServicesTestUtils
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.verifyAMJobConf(TestAMWebServicesJobConf.java:237)
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash(TestAMWebServicesJobConf.java:190)

FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault

Error Message:
org/apache/hadoop/yarn/webapp/WebServicesTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/webapp/WebServicesTestUtils
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.verifyAMJobConf(TestAMWebServicesJobConf.java:237)
        at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault(TestAMWebServicesJobConf.java:207)
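
To reproduce locally, the Maven errors above already name the failing module (hadoop-mapreduce-client-app), the resume command, and the surefire-reports directory. The commands below are a minimal sketch built only from those hints plus standard Maven/Surefire options (-pl, -am, -Dtest); the choice of the test goal and of a trunk checkout as the working directory are assumptions, not part of this build's output.

  # From the root of a Hadoop trunk checkout: resume the build from the failing
  # module, as the Maven error suggests (using "test" for the elided <goals>):
  mvn test -rf :hadoop-mapreduce-client-app

  # Or build and test only that module and its in-reactor dependencies, keeping
  # full stack traces (-e) or enabling debug logging (-X), per the log's hints:
  mvn -e test -pl :hadoop-mapreduce-client-app -am

  # Re-run only the affected test classes via the Surefire "test" property:
  mvn test -pl :hadoop-mapreduce-client-app -am -Dtest=TestRecovery,TestAMWebServicesJobConf

Individual results then land under the module's target/surefire-reports directory, as referenced in the [ERROR] output above.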
