See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/1522/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
###################################################################################
[...truncated 31896 lines...]
Running org.apache.hadoop.mapreduce.lib.partition.TestInputSampler
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.782 sec
Running org.apache.hadoop.mapreduce.lib.partition.TestTotalOrderPartitioner
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.077 sec
Running org.apache.hadoop.mapreduce.lib.partition.TestMRKeyFieldBasedComparator
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.146 sec
Running org.apache.hadoop.mapreduce.lib.partition.TestKeyFieldHelper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.137 sec
Running org.apache.hadoop.mapreduce.lib.partition.TestBinaryPartitioner
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.479 sec
Running org.apache.hadoop.mapreduce.lib.partition.TestMRKeyFieldBasedPartitioner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.417 sec
Running org.apache.hadoop.mapreduce.TestChild
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 62.18 sec
Running org.apache.hadoop.mapreduce.filecache.TestURIFragments
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.064 sec
Running org.apache.hadoop.mapreduce.TestMapReduce
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.423 sec

Results :

Tests run: 466, Failures: 0, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [1.761s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [38.324s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [25.257s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [2.286s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [5:45.840s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [1:36.681s]
[INFO] hadoop-mapreduce-client-jobclient ................. FAILURE [1:20:40.094s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:29:10.881s
[INFO] Finished at: Sun Aug 18 14:47:40 UTC 2013
[INFO] Final Memory: 40M/96M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ? -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Updating HDFS-5104
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.