< Running org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat
< Running org.apache.hadoop.hbase.mapreduce.TestImportExport
< Running org.apache.hadoop.hbase.mapreduce.TestImportTsv
< Running org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat
< Running org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1
< Running org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2
< Running org.apache.hadoop.hbase.mapreduce.TestTableMapReduce
< Running org.apache.hadoop.hbase.regionserver.TestJoinedScanners
< Running org.apache.hadoop.hbase.TestZooKeeper
On Tue, May 7, 2013 at 1:16 PM, <[email protected]> wrote:

> See <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/224/changes>
>
> Changes:
>
> [Michael Stack] HBASE-8483 HConnectionManager can leak ZooKeeper connections when using deleteStaleConnection
>
> [sershe] HBASE-8272 make compaction checker frequency configurable per table/cf; ADDENDUM
>
> ------------------------------------------
> [...truncated 7589 lines...]
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>         at java.lang.Thread.run(Thread.java:662)
> No results for java.util.concurrent.FutureTask@7deebece
> No results for java.util.concurrent.FutureTask@453106f9
> org.apache.maven.surefire.booter.SurefireBooterForkException: Error occurred in starting fork, check output in log
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:238)
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.access$000(ForkStarter.java:64)
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter$ParallelFork.call(ForkStarter.java:303)
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter$ParallelFork.call(ForkStarter.java:285)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>         at java.lang.Thread.run(Thread.java:662)
> No results for java.util.concurrent.FutureTask@60add02a
> Running org.apache.hadoop.hbase.TestInfoServers
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.451 sec
> org.apache.maven.surefire.booter.SurefireBooterForkException: Error occurred in starting fork, check output in log
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:238)
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.access$000(ForkStarter.java:64)
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter$ParallelFork.call(ForkStarter.java:303)
>         at org.apache.maven.plugin.surefire.booterclient.ForkStarter$ParallelFork.call(ForkStarter.java:285)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>         at java.lang.Thread.run(Thread.java:662)
> Running org.apache.hadoop.hbase.thrift2.TestThriftHBaseServiceHandler
> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.898 sec
> Running org.apache.hadoop.hbase.constraint.TestConstraint
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.712 sec
> Running org.apache.hadoop.hbase.rest.TestTableResource
> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.755 sec
> Running org.apache.hadoop.hbase.mapreduce.TestRowCounter
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 173.522 sec
> Running org.apache.hadoop.hbase.rest.TestGzipFilter
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.194 sec
> Running org.apache.hadoop.hbase.rest.TestSchemaResource
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.685 sec
> Running org.apache.hadoop.hbase.rest.TestStatusResource
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.668 sec
> Running org.apache.hadoop.hbase.mapreduce.TestImportTsv
> Tests run: 5, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 314.353 sec <<< FAILURE!
> No results for java.util.concurrent.FutureTask@1047e39d
> Running org.apache.hadoop.hbase.rest.TestVersionResource
> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.986 sec
> Running org.apache.hadoop.hbase.rest.TestRowResource
> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.442 sec
> Running org.apache.hadoop.hbase.rest.TestScannerResource
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 62.947 sec
> Running org.apache.hadoop.hbase.rest.TestScannersWithFilters
> Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.757 sec
> Running org.apache.hadoop.hbase.rest.TestMultiRowResource
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.536 sec
> Running org.apache.hadoop.hbase.rest.client.TestRemoteAdmin
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.457 sec
> Running org.apache.hadoop.hbase.rest.client.TestRemoteTable
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.971 sec
> Running org.apache.hadoop.hbase.backup.example.TestZooKeeperTableArchiveClient
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.876 sec
> Running org.apache.hadoop.hbase.TestHBaseTestingUtility
> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.751 sec
> Running org.apache.hadoop.hbase.client.TestMultipleTimestamps
> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.275 sec
> Running org.apache.hadoop.hbase.backup.TestHFileArchiving
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 110.665 sec
> Running org.apache.hadoop.hbase.client.TestMultiParallel
> Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 144.108 sec
> Running org.apache.hadoop.hbase.client.replication.TestReplicationAdmin
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.698 sec
> Running org.apache.hadoop.hbase.client.TestFromClientSide3
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 179.203 sec
> Running org.apache.hadoop.hbase.client.TestHCM
> Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.217 sec
> Running org.apache.hadoop.hbase.client.TestTimestampsFilter
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.723 sec
> Running org.apache.hadoop.hbase.client.TestRestoreSnapshotFromClient
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 202.605 sec
> Running org.apache.hadoop.hbase.client.TestSnapshotFromClient
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.307 sec
> Running org.apache.hadoop.hbase.client.TestClientScannerRPCTimeout
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.971 sec
> Running org.apache.hadoop.hbase.client.TestHTableUtil
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.173 sec
> Running org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 721.593 sec
> Running org.apache.hadoop.hbase.client.TestScannersFromClientSide
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.511 sec
> Running org.apache.hadoop.hbase.client.TestSnapshotCloneIndependence
> Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.693 sec
> Running org.apache.hadoop.hbase.client.TestScannerTimeout
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.193 sec
> Running org.apache.hadoop.hbase.client.TestHTableMultiplexer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.495 sec
> Running org.apache.hadoop.hbase.client.TestCloneSnapshotFromClient
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.978 sec
> Running org.apache.hadoop.hbase.client.TestMetaScanner
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.561 sec
> Running org.apache.hadoop.hbase.client.TestSnapshotMetadata
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.872 sec
> Running org.apache.hadoop.hbase.client.TestHTablePool$TestHTableReusablePool
> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.414 sec
> Running org.apache.hadoop.hbase.client.TestHTablePool$TestHTableThreadLocalPool
> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.754 sec
> Running org.apache.hadoop.hbase.client.TestFromClientSide
> Tests run: 61, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 161.801 sec
> Running org.apache.hadoop.hbase.client.TestClientTimeouts
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.052 sec
> Running org.apache.hadoop.hbase.catalog.TestMetaMigrationConvertingToPB
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.197 sec
> Running org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor
> Tests run: 61, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 152.582 sec
> Running org.apache.hadoop.hbase.catalog.TestMetaReaderEditor
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.728 sec
> Running org.apache.hadoop.hbase.catalog.TestCatalogTracker
> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.736 sec
> Running org.apache.hadoop.hbase.ipc.TestHBaseClient
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.432 sec
> Running org.apache.hadoop.hbase.catalog.TestMetaReaderEditorNoCluster
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.514 sec
> Running org.apache.hadoop.hbase.ipc.TestProtoBufRpc
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.525 sec
> Running org.apache.hadoop.hbase.client.TestShell
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 76.052 sec
> Running org.apache.hadoop.hbase.ipc.TestDelayedRpc
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.931 sec
> Running org.apache.hadoop.hbase.thrift.TestThriftServerCmdLine
> Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.983 sec
> Running org.apache.hadoop.hbase.thrift.TestThriftServer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.399 sec
> Running org.apache.hadoop.hbase.client.TestAdmin
> Tests run: 43, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 680.034 sec
>
> Results :
>
> Failed tests:
>   testSimpleCase(org.apache.hadoop.hbase.mapreduce.TestImportExport)
>   testMetaExport(org.apache.hadoop.hbase.mapreduce.TestImportExport)
>   testWithFilter(org.apache.hadoop.hbase.mapreduce.TestImportExport)
>   testWithDeletes(org.apache.hadoop.hbase.mapreduce.TestImportExport)
>   testMROnTable(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
>
> Tests run: 1475, Failures: 5, Errors: 0, Skipped: 14
>
> [JENKINS] Recording test results
> [INFO]
> [INFO] ------------------------------------------------------------------------
> [INFO] Skipping HBase
> [INFO] This project has been banned from the build due to previous failures.
> [INFO] ------------------------------------------------------------------------
> [INFO]
> [INFO] ------------------------------------------------------------------------
> [INFO] Skipping HBase
> [INFO] This project has been banned from the build due to previous failures.
> [INFO] ------------------------------------------------------------------------
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] HBase ............................................. SUCCESS [52.741s]
> [INFO] HBase - Common .................................... SUCCESS [1:41.732s]
> [INFO] HBase - Protocol .................................. SUCCESS [17.964s]
> [INFO] HBase - Client .................................... SUCCESS [12.599s]
> [INFO] HBase - Prefix Tree ............................... SUCCESS [5.181s]
> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [1.377s]
> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [16.913s]
> [INFO] HBase - Server .................................... FAILURE [51:32.034s]
> [INFO] HBase - Integration Tests ......................... SKIPPED
> [INFO] HBase - Examples .................................. SKIPPED
> [INFO] HBase - Assembly .................................. SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 55:10.024s
> [INFO] Finished at: Tue May 07 20:16:06 UTC 2013
> [INFO] Final Memory: 126M/787M
> [INFO] ------------------------------------------------------------------------
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop2-compat/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop2-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop2-compat/0.97.0-SNAPSHOT/hbase-hadoop2-compat-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop2-compat/target/hbase-hadoop2-compat-0.97.0-SNAPSHOT.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop2-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop2-compat/0.97.0-SNAPSHOT/hbase-hadoop2-compat-0.97.0-SNAPSHOT.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop2-compat/target/hbase-hadoop2-compat-0.97.0-SNAPSHOT-sources.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop2-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop2-compat/0.97.0-SNAPSHOT/hbase-hadoop2-compat-0.97.0-SNAPSHOT-sources.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop2-compat/target/hbase-hadoop2-compat-0.97.0-SNAPSHOT-tests.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop2-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop2-compat/0.97.0-SNAPSHOT/hbase-hadoop2-compat-0.97.0-SNAPSHOT-tests.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-common/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-common/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-common/0.97.0-SNAPSHOT/hbase-common-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-common/target/hbase-common-0.97.0-SNAPSHOT.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-common/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-common/0.97.0-SNAPSHOT/hbase-common-0.97.0-SNAPSHOT.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-common/target/hbase-common-0.97.0-SNAPSHOT-sources.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-common/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-common/0.97.0-SNAPSHOT/hbase-common-0.97.0-SNAPSHOT-sources.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-common/target/hbase-common-0.97.0-SNAPSHOT-tests.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-common/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-common/0.97.0-SNAPSHOT/hbase-common-0.97.0-SNAPSHOT-tests.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop-compat/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop-compat/0.97.0-SNAPSHOT/hbase-hadoop-compat-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop-compat/target/hbase-hadoop-compat-0.97.0-SNAPSHOT.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop-compat/0.97.0-SNAPSHOT/hbase-hadoop-compat-0.97.0-SNAPSHOT.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop-compat/target/hbase-hadoop-compat-0.97.0-SNAPSHOT-sources.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop-compat/0.97.0-SNAPSHOT/hbase-hadoop-compat-0.97.0-SNAPSHOT-sources.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-hadoop-compat/target/hbase-hadoop-compat-0.97.0-SNAPSHOT-tests.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-hadoop-compat/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-hadoop-compat/0.97.0-SNAPSHOT/hbase-hadoop-compat-0.97.0-SNAPSHOT-tests.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-examples/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-examples/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-examples/0.97.0-SNAPSHOT/hbase-examples-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-it/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-it/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-it/0.97.0-SNAPSHOT/hbase-it-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase/0.97.0-SNAPSHOT/hbase-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/target/hbase-0.97.0-SNAPSHOT-site.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase/0.97.0-SNAPSHOT/hbase-0.97.0-SNAPSHOT-site.xml
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-assembly/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-assembly/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-assembly/0.97.0-SNAPSHOT/hbase-assembly-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-client/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-client/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-client/0.97.0-SNAPSHOT/hbase-client-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-client/target/hbase-client-0.97.0-SNAPSHOT.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-client/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-client/0.97.0-SNAPSHOT/hbase-client-0.97.0-SNAPSHOT.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-client/target/hbase-client-0.97.0-SNAPSHOT-sources.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-client/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-client/0.97.0-SNAPSHOT/hbase-client-0.97.0-SNAPSHOT-sources.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-client/target/hbase-client-0.97.0-SNAPSHOT-tests.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-client/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-client/0.97.0-SNAPSHOT/hbase-client-0.97.0-SNAPSHOT-tests.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-prefix-tree/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-prefix-tree/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-prefix-tree/0.97.0-SNAPSHOT/hbase-prefix-tree-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-prefix-tree/target/hbase-prefix-tree-0.97.0-SNAPSHOT.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-prefix-tree/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-prefix-tree/0.97.0-SNAPSHOT/hbase-prefix-tree-0.97.0-SNAPSHOT.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-prefix-tree/target/hbase-prefix-tree-0.97.0-SNAPSHOT-sources.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-prefix-tree/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-prefix-tree/0.97.0-SNAPSHOT/hbase-prefix-tree-0.97.0-SNAPSHOT-sources.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-prefix-tree/target/hbase-prefix-tree-0.97.0-SNAPSHOT-tests.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-prefix-tree/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-prefix-tree/0.97.0-SNAPSHOT/hbase-prefix-tree-0.97.0-SNAPSHOT-tests.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-protocol/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-protocol/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-protocol/0.97.0-SNAPSHOT/hbase-protocol-0.97.0-SNAPSHOT.pom
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-protocol/target/hbase-protocol-0.97.0-SNAPSHOT.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-protocol/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-protocol/0.97.0-SNAPSHOT/hbase-protocol-0.97.0-SNAPSHOT.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-protocol/target/hbase-protocol-0.97.0-SNAPSHOT-sources.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-protocol/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-protocol/0.97.0-SNAPSHOT/hbase-protocol-0.97.0-SNAPSHOT-sources.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-protocol/target/hbase-protocol-0.97.0-SNAPSHOT-tests.jar> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-protocol/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-protocol/0.97.0-SNAPSHOT/hbase-protocol-0.97.0-SNAPSHOT-tests.jar
> [JENKINS] Archiving <http://54.241.6.143/job/HBase-TRUNK-Hadoop-2/ws/hbase-server/pom.xml> to /var/lib/jenkins/jobs/HBase-TRUNK-Hadoop-2/modules/org.apache.hbase$hbase-server/builds/2013-05-07_19-20-49/archive/org.apache.hbase/hbase-server/0.97.0-SNAPSHOT/hbase-server-0.97.0-SNAPSHOT.pom
> Waiting for Jenkins to finish collecting data
> mavenExecutionResult exceptions not empty
> message : Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12-TRUNK-HBASE-2:test (secondPartTestsExecution) on project hbase-server: Failure or timeout
> cause : Failure or timeout
> Stack trace :
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12-TRUNK-HBASE-2:test (secondPartTestsExecution) on project hbase-server: Failure or timeout
>         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
>         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
>         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
>         at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
>         at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
>         at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
>         at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
>         at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
>         at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
>         at org.jvnet.hudson.maven3.launcher.Maven3Launcher.main(Maven3Launcher.java:79)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:329)
>         at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:239)
>         at org.jvnet.hudson.maven3.agent.Maven3Main.launch(Maven3Main.java:158)
>         at hudson.maven.Maven3Builder.call(Maven3Builder.java:100)
>         at hudson.maven.Maven3Builder.call(Maven3Builder.java:66)
>         at hudson.remoting.UserRequest.perform(UserRequest.java:118)
>         at hudson.remoting.UserRequest.perform(UserRequest.java:48)
>         at hudson.remoting.Request$2.run(Request.java:326)
>         at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>         at java.lang.Thread.run(Thread.java:662)
> Caused by: org.apache.maven.plugin.MojoExecutionException: Failure or timeout
>         at org.apache.maven.plugin.surefire.SurefirePlugin.assertNoFailureOrTimeout(SurefirePlugin.java:643)
>         at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary(SurefirePlugin.java:624)
>         at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:137)
>         at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:98)
>         at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
>         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
>         ... 27 more
> channel stopped

-- 
// Jonathan Hsieh (shay)
// Software Engineer, Cloudera
// [email protected]
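
For anyone chasing these locally: the report says the failure was in the hbase-server module, with TestImportExport and TestImportTsv as the failing classes, so something along the lines of the command below should rerun just those tests against a trunk checkout. This is a sketch, not taken from the log; the flags are stock Maven/Surefire options (-pl selects the module, -am builds its dependencies, -Dtest limits the classes run).

    # Hypothetical local reproduction; flags are standard Maven/Surefire, module name per the log above
    mvn test -pl hbase-server -am -Dtest=TestImportExport,TestImportTsv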
