See <https://builds.apache.org/job/HBase-1.1-JDK7/1644/changes>
Changes:

[stack] HBASE-15153 Apply checkFamilies addendum on increment to 1.1 and 1.0

------------------------------------------
[...truncated 5398 lines...]
  Run 3: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind

org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor.testIncrement[fast=false](org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor)
  Run 1: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 2: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 3: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind

org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor.testIncrement[fast=true](org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor)
  Run 1: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 2: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 3: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind

org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor.testIncrementingInvalidValue[fast=false](org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor)
  Run 1: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 2: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 3: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind

org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor.testIncrementingInvalidValue[fast=true](org.apache.hadoop.hbase.client.TestIncrementFromClientSideWithCoprocessor)
  Run 1: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 2: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind
  Run 3: TestIncrementFromClientSideWithCoprocessor.before:47->TestIncrementsFromClientSide.before:101 » Bind

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementInvalidArguments[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementInvalidArguments[fast=true](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementOnSameColumn[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Problem binding to [localhost:0...
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementOnSameColumn[fast=true](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementOutOfOrder[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementOutOfOrder[fast=true](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementReturnValue[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementReturnValue[fast=true](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementWithDeletes[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrement[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrement[fast=true](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementingInvalidValue[fast=false](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestIncrementsFromClientSide.testIncrementingInvalidValue[fast=true](org.apache.hadoop.hbase.client.TestIncrementsFromClientSide)
  Run 1: TestIncrementsFromClientSide.before:101 » Bind Problem binding to [localhost:0...
  Run 2: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0
  Run 3: TestIncrementsFromClientSide.before:101 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestMetaScanner.testConcurrentMetaScannerAndCatalogJanitor(org.apache.hadoop.hbase.client.TestMetaScanner)
  Run 1: TestMetaScanner.testConcurrentMetaScannerAndCatalogJanitor:119->setUp:58 » Bind
  Run 2: TestMetaScanner.testConcurrentMetaScannerAndCatalogJanitor:119->setUp:58 » Bind
  Run 3: TestMetaScanner.testConcurrentMetaScannerAndCatalogJanitor:119->setUp:58 » Bind

org.apache.hadoop.hbase.client.TestMetaScanner.testMetaScanner(org.apache.hadoop.hbase.client.TestMetaScanner)
  Run 1: TestMetaScanner.testMetaScanner:71->setUp:58 » Bind Port in use: localhost:0
  Run 2: TestMetaScanner.testMetaScanner:71->setUp:58 » Bind Port in use: localhost:0
  Run 3: TestMetaScanner.testMetaScanner:71->setUp:58 » Bind Port in use: localhost:0

org.apache.hadoop.hbase.client.TestMultiParallel.org.apache.hadoop.hbase.client.TestMultiParallel
  Run 1: TestMultiParallel.beforeClass:76 » Bind Problem binding to [localhost:0] java....
  Run 2: TestMultiParallel.afterClass:84 NullPointer

  TestPutWithDelete.setUpBeforeClass:40 » Bind Problem binding to [localhost:0] ...
  TestReplicasClient.beforeClass:174 » Bind Port in use: localhost:0
  TestShortCircuitConnection.setUpBeforeClass:50 » Bind Problem binding to [loca...
  TestSizeFailures.setUpBeforeClass:62 » Bind Port in use: localhost:0
  TestSnapshotCloneIndependence.setupCluster:70 » Bind Problem binding to [local...

org.apache.hadoop.hbase.client.TestTableSnapshotScanner.testWithMultiRegion(org.apache.hadoop.hbase.client.TestTableSnapshotScanner)
  Run 1: TestTableSnapshotScanner.testWithMultiRegion:119->testScanner:129->setupCluster:59 » Bind
  Run 2: TestTableSnapshotScanner.testWithMultiRegion:119->testScanner:129->setupCluster:59 » Bind
  Run 3: TestTableSnapshotScanner.testWithMultiRegion:119->testScanner:129->setupCluster:59 » Bind

org.apache.hadoop.hbase.client.TestTableSnapshotScanner.testWithOfflineHBaseMultiRegion(org.apache.hadoop.hbase.client.TestTableSnapshotScanner)
  Run 1: TestTableSnapshotScanner.testWithOfflineHBaseMultiRegion:124->testScanner:129->setupCluster:59 » Bind
  Run 2: TestTableSnapshotScanner.testWithOfflineHBaseMultiRegion:124->testScanner:129->setupCluster:59 » Bind
  Run 3: TestTableSnapshotScanner.testWithOfflineHBaseMultiRegion:124->testScanner:129->setupCluster:59 » Bind

org.apache.hadoop.hbase.client.TestTableSnapshotScanner.testWithSingleRegion(org.apache.hadoop.hbase.client.TestTableSnapshotScanner)
  Run 1: TestTableSnapshotScanner.testWithSingleRegion:114->testScanner:129->setupCluster:59 » Bind
  Run 2: TestTableSnapshotScanner.testWithSingleRegion:114->testScanner:129->setupCluster:59 » Bind
  Run 3: TestTableSnapshotScanner.testWithSingleRegion:114->testScanner:129->setupCluster:59 » Bind

  TestMasterFailoverWithProcedures.testWALfencingWithoutWALRolling:173->testWALfencing:205 » FileNotFound

org.apache.hadoop.hbase.util.TestHBaseFsck.testQuarantineMissingRegionDir(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testQuarantineMissingRegionDir:2231->doQuarantineTest:2143->cleanupTable:475->deleteTable:2811 » TableNotDisabled
  Run 2: TestHBaseFsck.testQuarantineMissingRegionDir:2231->doQuarantineTest:2143->cleanupTable:475->deleteTable:2811 » TableNotDisabled
  Run 3: TestHBaseFsck.testQuarantineMissingRegionDir:2231->doQuarantineTest:2143->cleanupTable:475->deleteTable:2811 » TableNotDisabled

Flaked tests:
org.apache.hadoop.hbase.regionserver.TestSplitTransactionOnCluster.testFailedSplit(org.apache.hadoop.hbase.regionserver.TestSplitTransactionOnCluster)
  Run 1: TestSplitTransactionOnCluster.testFailedSplit:1339 null
  Run 2: PASS

Tests run: 2415, Failures: 1, Errors: 47, Skipped: 18, Flakes: 1

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache HBase ...................................... SUCCESS [3.065s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [0.691s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.241s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [1.206s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [12.856s]
[INFO] Apache HBase - Common ............................. SUCCESS [1:23.339s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [2:42.621s]
[INFO] Apache HBase - Client ............................. SUCCESS [1:24.686s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [7.383s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [6.781s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [9.668s]
[INFO] Apache HBase - Server ............................. FAILURE [1:25:16.344s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:31:31.334s
[INFO] Finished at: Fri Jan 22 23:59:32 UTC 2016
[INFO] Final Memory: 53M/544M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.maven.surefire.report.ReporterException: When writing xml report stdout/stderr: /tmp/stderr4969615412005959232deferred (No such file or directory) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  :
# Post-build task script. TODO: Check this in and have all builds reference check-in.
pwd && ls
# NOTE!!!! The below code has been copied and pasted from ./dev-tools/run-test.sh
# Do not change here without syncing there and vice-versa.
ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
  echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
  sleep 30
  ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    echo " {color:red}There appear to be $ZOMBIE_TESTS_COUNT zombie tests{color}, they should have been killed by surefire but survived"
    jps -v | grep surefirebooter | grep -e '-Dhbase.test'
    jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
    # Exit with error
    exit 1
  else
    echo "We're ok: there is no zombie test, but some tests took some time to stop"
  fi
else
  echo "We're ok: there is no zombie test"
fi
[HBase-1.1-JDK7] $ /bin/bash -xe /tmp/hudson9037438500658910288.sh
+ pwd
<https://builds.apache.org/job/HBase-1.1-JDK7/ws/>
+ ls
bin CHANGES.txt conf dev-support hbase-annotations hbase-assembly hbase-checkstyle hbase-client hbase-common hbase-examples hbase-hadoop2-compat hbase-hadoop-compat hbase-it hbase-native-client hbase-prefix-tree hbase-procedure hbase-protocol hbase-resource-bundle hbase-rest hbase-server hbase-shaded hbase-shell hbase-testing-util hbase-thrift LICENSE.txt NOTICE.txt pom.xml README.txt src target
++ jps -v
++ grep surefirebooter
++ grep -e -Dhbase.test
++ wc -l
+ ZOMBIE_TESTS_COUNT=0
+ [[ 0 != 0 ]]
+ echo 'We'\''re ok: there is no zombie test'
We're ok: there is no zombie test
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
Updating HBASE-15153
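A note on the failure pattern: the long run of `» Bind` / `Port in use: localhost:0` errors is the symptom the zombie check above is meant to catch — a leftover surefire JVM from an earlier test class can keep the mini-cluster's sockets open, so every later `before`/`setUp` that starts a cluster fails to bind. The counting pipeline from `./dev-tools/run-test.sh` can be sketched as a small reusable function; `count_zombie_surefire` and the canned `jps -v` sample below are hypothetical names for illustration, not part of the build script:

```shell
#!/bin/sh
# Sketch (assumption, not the checked-in script): the zombie-JVM count from
# ./dev-tools/run-test.sh as a function. It reads `jps -v`-style lines from
# stdin (so it can be exercised without a live JVM) and counts surefire forks
# that were launched with the -Dhbase.test marker.
count_zombie_surefire() {
  grep surefirebooter | grep -e '-Dhbase.test' | wc -l
}

# Canned sample standing in for real `jps -v` output: one HBase surefire
# fork plus one unrelated JVM.
sample='12345 surefirebooter -Dhbase.test -Xmx2800m
67890 Jps -Dapplication.home=/usr/lib/jvm/java-7'

count=$(printf '%s\n' "$sample" | count_zombie_surefire | tr -d '[:space:]')
echo "$count"   # prints: 1
```

In the real check, a nonzero count after the 30-second grace period is what triggers the `jstack` dump and the `exit 1` above.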
