See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/881/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10650 lines...]
Running org.apache.hadoop.mapreduce.v2.TestUberAM
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 91.123 sec <<< FAILURE!
Running org.apache.hadoop.mapreduce.TestYarnClientProtocolProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.625 sec

Results :

Tests in error: 
  testJobHistoryData(org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService): Failed to Start org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
  org.apache.hadoop.mapreduce.v2.TestMROldApiJobs: Failed to Start org.apache.hadoop.mapreduce.v2.TestMROldApiJobs
  org.apache.hadoop.mapreduce.v2.TestUberAM: Failed to Start org.apache.hadoop.mapreduce.v2.TestMRJobs

Tests run: 14, Failures: 0, Errors: 3, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] hadoop-yarn ....................................... SUCCESS [4.763s]
[INFO] hadoop-yarn-api ................................... SUCCESS [7.911s]
[INFO] hadoop-yarn-common ................................ SUCCESS [27.666s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.097s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [9.507s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [1:13.311s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [1.888s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:21.718s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [12.171s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.059s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [15.323s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.144s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.185s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [13.778s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [9.201s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [2.335s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [2:20.066s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [19.240s]
[INFO] hadoop-mapreduce-client-jobclient ................. FAILURE [20:08.469s]
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27:08.373s
[INFO] Finished at: Sun Oct 30 00:42:47 UTC 2011
[INFO] Final Memory: 65M/772M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.9:test (default-test) on project hadoop-mapreduce-client-jobclient: Failure or timeout -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
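
For anyone re-running this locally, Maven's resume hint above can be made concrete. A minimal sketch, assuming the original goals were along the lines of "clean install" (the exact goals are not shown in the truncated log) and that the commands are run from the mapreduce source root:

    # Resume the reactor from the failing module
    mvn clean install -rf :hadoop-mapreduce-client-jobclient

    # Or re-run a single failing test class via the Surefire -Dtest parameter
    mvn test -pl :hadoop-mapreduce-client-jobclient -Dtest=TestMRJobsWithHistoryService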
