See <https://builds.apache.org/job/HBase-1.5/jdk=JDK_1_8,label=Hadoop&&!H13/24/display/redirect?page=changes>
Changes:

[tedyu] HBASE-18614 Setting BUCKET_CACHE_COMBINED_KEY to false disables stats on

------------------------------------------
[...truncated 2.48 MB...]
  Run 1: TestSecureLoadIncrementalHFilesSplitRecovery>TestLoadIncrementalHFilesSplitRecovery.testGroupOrSplitWhenRegionHoleExistsInMeta:502->TestLoadIncrementalHFilesSplitRecovery.setupTableWithSplitkeys:159->Object.wait:-2 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFilesSplitRecovery.testSplitTmpFileCleanUp(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFilesSplitRecovery)
  Run 1: TestSecureLoadIncrementalHFilesSplitRecovery>TestLoadIncrementalHFilesSplitRecovery.testSplitTmpFileCleanUp:431->TestLoadIncrementalHFilesSplitRecovery.setupTableWithSplitkeys:159->Object.wait:-2 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFilesSplitRecovery.testSplitWhileBulkLoadPhase(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFilesSplitRecovery)
  Run 1: TestSecureLoadIncrementalHFilesSplitRecovery>TestLoadIncrementalHFilesSplitRecovery.testSplitWhileBulkLoadPhase:346->TestLoadIncrementalHFilesSplitRecovery.setupTable:136->Object.wait:-2 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationForFlushAndCompaction(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationForFlushAndCompaction:306 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationForFlushAndCompaction:306 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDisabledTables(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDisabledTables:332->testRegionReplicaReplicationIgnoresDisabledTables:351 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDisabledTables:332->testRegionReplicaReplicationIgnoresDisabledTables:351 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDroppedTables(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDroppedTables:337->testRegionReplicaReplicationIgnoresDisabledTables:350 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDroppedTables:337->testRegionReplicaReplicationIgnoresDisabledTables:350 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreated(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreated:121 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreated:121 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable:152 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable:152 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith10Replicas(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith10Replicas:261->testRegionReplicaReplication:180 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith10Replicas:261->testRegionReplicaReplication:180 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith2Replicas(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith2Replicas:251->testRegionReplicaReplication:180 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith2Replicas:251->testRegionReplicaReplication:180 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith3Replicas(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith3Replicas:256->testRegionReplicaReplication:180 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith3Replicas:256->testRegionReplicaReplication:180 » RetriesExhausted
  Run 3: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaWithoutMemstoreReplication(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaWithoutMemstoreReplication:271 » RetriesExhausted
  Run 2: TestRegionReplicaReplicationEndpoint.testRegionReplicaWithoutMemstoreReplication:271 » RetriesExhausted
  Run 3: PASS

Tests run: 2849, Failures: 0, Errors: 1, Skipped: 57, Flakes: 20

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache HBase ...................................... SUCCESS [1:44.470s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [15.496s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.227s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [1.667s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [17.432s]
[INFO] Apache HBase - Common ............................. SUCCESS [2:38.861s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [3:35.325s]
[INFO] Apache HBase - Metrics API ........................ SUCCESS [1.287s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [11.294s]
[INFO] Apache HBase - Metrics Implementation ............. SUCCESS [5.280s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [18.739s]
[INFO] Apache HBase - Client ............................. SUCCESS [1:47.840s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [8.714s]
[INFO] Apache HBase - Server ............................. FAILURE [1:46:07.139s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] Apache HBase - Archetypes ......................... SKIPPED
[INFO] Apache HBase - Exemplar for hbase-client archetype SKIPPED
[INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SKIPPED
[INFO] Apache HBase - Archetype builder .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:57:45.643s
[INFO] Finished at: Wed Aug 23 19:52:00 UTC 2017
[INFO] Final Memory: 103M/1807M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: There was a timeout or other error in the fork -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Post-build task script. TODO: Check this in and have all builds reference check-in.
pwd && ls
# NOTE!!!! The below code has been copied and pasted from ./dev-tools/run-test.sh
# Do not change here without syncing there and vice-versa.
ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
  echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
  sleep 30
  ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    echo " {color:red}There appear to be $ZOMBIE_TESTS_COUNT zombie tests{color}, they should have been killed by surefire but survived"
    jps -v | grep surefirebooter | grep -e '-Dhbase.test'
    jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
    # Exit with error
    exit 1
  else
    echo "We're ok: there is no zombie test, but some tests took some time to stop"
  fi
else
  echo "We're ok: there is no zombie test"
fi
Setting JDK_1_7_LATEST__HOME=/home/jenkins/tools/java/latest1.7
Setting JDK_1_8_LATEST__HOME=/home/jenkins/tools/java/latest1.8
[30b16b4c] $ /bin/bash -xe /tmp/jenkins601265808701995463.sh
+ pwd
<https://builds.apache.org/job/HBase-1.5/jdk=JDK_1_8,label=Hadoop&&!H13/ws/>
+ ls
bin
CHANGES.txt
conf
dev-support
extra_env_var
hbase-annotations
hbase-archetypes
hbase-assembly
hbase-checkstyle
hbase-client
hbase-common
hbase-examples
hbase-external-blockcache
hbase-hadoop2-compat
hbase-hadoop-compat
hbase-it
hbase-metrics
hbase-metrics-api
hbase-native-client
hbase-prefix-tree
hbase-procedure
hbase-protocol
hbase-resource-bundle
hbase-rest
hbase-server
hbase-shaded
hbase-shell
hbase-testing-util
hbase-thrift
LICENSE.txt
NOTICE.txt
pom.xml
README.txt
src
target
++ jps -v
++ grep surefirebooter
++ wc -l
++ grep -e -Dhbase.test
+ ZOMBIE_TESTS_COUNT=0
+ [[ 0 != 0 ]]
+ echo 'We'\''re ok: there is no zombie test'
We're ok: there is no zombie test
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Setting JDK_1_7_LATEST__HOME=/home/jenkins/tools/java/latest1.7
Setting JDK_1_8_LATEST__HOME=/home/jenkins/tools/java/latest1.8
[Fast Archiver] No prior successful build to compare, so performing full copy of artifacts
Recording test results
Setting JDK_1_7_LATEST__HOME=/home/jenkins/tools/java/latest1.7
Setting JDK_1_8_LATEST__HOME=/home/jenkins/tools/java/latest1.8
ERROR: Step 'Publish JUnit test result report' aborted due to exception:
java.lang.OutOfMemoryError: Java heap space
    at com.sun.org.apache.xerces.internal.util.XMLStringBuffer.append(XMLStringBuffer.java:208)
    at com.sun.org.apache.xerces.internal.impl.XMLEntityScanner.scanData(XMLEntityScanner.java:1515)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanCDATASection(XMLDocumentFragmentScannerImpl.java:1654)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3014)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:602)
    at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:112)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:505)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:841)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:770)
    at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
    at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
    at org.dom4j.io.SAXReader.read(SAXReader.java:465)
    at org.dom4j.io.SAXReader.read(SAXReader.java:343)
    at hudson.tasks.junit.SuiteResult.parse(SuiteResult.java:132)
    at hudson.tasks.junit.TestResult.parse(TestResult.java:302)
    at hudson.tasks.junit.TestResult.parsePossiblyEmpty(TestResult.java:244)
    at hudson.tasks.junit.TestResult.parse(TestResult.java:175)
    at hudson.tasks.junit.TestResult.parse(TestResult.java:154)
    at hudson.tasks.junit.TestResult.<init>(TestResult.java:126)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:132)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:107)
    at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2739)
    at hudson.remoting.UserRequest.perform(UserRequest.java:153)
    at hudson.remoting.UserRequest.perform(UserRequest.java:50)
    at hudson.remoting.Request$2.run(Request.java:336)
    at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
    at ......remote call to H2(Native Method)
    at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1545)
    at hudson.remoting.UserResponse.retrieve(UserRequest.java:253)
    at hudson.remoting.Channel.call(Channel.java:830)
Caused: java.io.IOException: Remote call on H2 failed
    at hudson.remoting.Channel.call(Channel.java:838)
    at hudson.FilePath.act(FilePath.java:986)
Caused: java.io.IOException: remote file operation failed: <https://builds.apache.org/job/HBase-1.5/jdk=JDK_1_8,label=Hadoop&&!H13/ws/> at hudson.remoting.Channel@5017dde5:H2
    at hudson.FilePath.act(FilePath.java:993)
    at hudson.FilePath.act(FilePath.java:975)
    at hudson.tasks.junit.JUnitParser.parseResult(JUnitParser.java:103)
    at hudson.tasks.junit.JUnitResultArchiver.parse(JUnitResultArchiver.java:128)
    at hudson.tasks.junit.JUnitResultArchiver.perform(JUnitResultArchiver.java:149)
    at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:81)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:735)
    at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:676)
    at hudson.model.Build$BuildExecution.post2(Build.java:186)
    at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:621)
    at hudson.model.Run.execute(Run.java:1760)
    at hudson.matrix.MatrixRun.run(MatrixRun.java:146)
    at hudson.model.ResourceController.execute(ResourceController.java:97)
    at hudson.model.Executor.run(Executor.java:405)
Setting JDK_1_7_LATEST__HOME=/home/jenkins/tools/java/latest1.7
Setting JDK_1_8_LATEST__HOME=/home/jenkins/tools/java/latest1.8
