See <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/4217/>
------------------------------------------
[...truncated 3 lines...]
[INFO] ------------------------------------------------------------------------
Downloading: http://repository.apache.org/snapshots/org/apache/james/apache-james-mailbox-hbase/0.6-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/james/apache-james-mailbox-hbase/0.6-SNAPSHOT/maven-metadata.xml (2 KB at 1.4 KB/sec)
Downloading: http://repository.apache.org/snapshots/org/apache/james/apache-james-mailbox-hbase/0.6-SNAPSHOT/apache-james-mailbox-hbase-0.6-20140131.041056-264.pom
Downloaded: http://repository.apache.org/snapshots/org/apache/james/apache-james-mailbox-hbase/0.6-SNAPSHOT/apache-james-mailbox-hbase-0.6-20140131.041056-264.pom (6 KB at 31.6 KB/sec)
Downloading: http://repository.apache.org/snapshots/org/apache/james/apache-james-mailbox-hbase/0.6-SNAPSHOT/apache-james-mailbox-hbase-0.6-20140131.041056-264-tests.jar
Downloaded: http://repository.apache.org/snapshots/org/apache/james/apache-james-mailbox-hbase/0.6-SNAPSHOT/apache-james-mailbox-hbase-0.6-20140131.041056-264-tests.jar (42 KB at 144.1 KB/sec)
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ james-server-data-hbase ---
[INFO] Deleting <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target>
[INFO]
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ james-server-data-hbase ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ james-server-data-hbase ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/src/main/resources>
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ james-server-data-hbase ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 7 source files to <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/classes>
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ james-server-data-hbase ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ james-server-data-hbase ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 4 source files to <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/test-classes>
[WARNING] <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/src/test/java/org/apache/james/system/hbase/TablePoolTest.java>: <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/src/test/java/org/apache/james/system/hbase/TablePoolTest.java> uses or overrides a deprecated API.
[WARNING] <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/src/test/java/org/apache/james/system/hbase/TablePoolTest.java>: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.13:test (default-test) @ james-server-data-hbase ---
[INFO] Surefire report directory: <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.james.rrt.hbase.HBaseRecipientRewriteTableTest
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-log4j12/1.7.2/slf4j-log4j12-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-simple/1.7.2/slf4j-simple-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2014-02-01 00:31:46,727 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
2014-02-01 00:31:54,286 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:31:54,315 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:31:54,329 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
Starting DataNode 0 with dfs.data.dir: <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/test-data/baecb05a-23de-40fa-ab37-ea6a11f5120c/dfscluster_0fc4f656-f09c-46bc-8f52-c1e7e52d768a/dfs/data/data1,/home/jenkins/jenkins-slave/workspace/james-server-trunk/trunk/data/data-hbase/target/test-data/baecb05a-23de-40fa-ab37-ea6a11f5120c/dfscluster_0fc4f656-f09c-46bc-8f52-c1e7e52d768a/dfs/data/data2>
2014-02-01 00:31:55,860 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - NameNode metrics system already initialized!
2014-02-01 00:31:55,860 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:31:56,848 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name jvm already exists!
Cluster is active
2014-02-01 00:32:00,168 [org.apache.hadoop.hdfs.server.datanode.DataXceiver@256f8834] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-02-01 00:32:07,387 [RegionServer:0;vesta.apache.org,48767,1391214719942] ERROR org.apache.hadoop.hbase.regionserver.HRegionServer - Failed init
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:174)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:139)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77)
	at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
	at org.apache.hadoop.http.HttpServer.start(HttpServer.java:593)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.putUpWebUI(HRegionServer.java:1417)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.startServiceThreads(HRegionServer.java:1383)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:926)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.handleReportForDutyResponse(MiniHBaseCluster.java:110)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:639)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.runRegionServer(MiniHBaseCluster.java:136)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.access$000(MiniHBaseCluster.java:89)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer$1.run(MiniHBaseCluster.java:120)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:357)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1118)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:622)
	at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
	at org.apache.hadoop.hbase.security.User.call(User.java:586)
	at org.apache.hadoop.hbase.security.User.access$700(User.java:50)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:426)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.run(MiniHBaseCluster.java:118)
	at java.lang.Thread.run(Thread.java:701)
2014-02-01 00:32:07,391 [RegionServer:0;vesta.apache.org,48767,1391214719942] FATAL org.apache.hadoop.hbase.regionserver.HRegionServer - ABORTING region server vesta.apache.org,48767,1391214719942: Unhandled exception: Address already in use
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:174)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:139)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77)
	at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
	at org.apache.hadoop.http.HttpServer.start(HttpServer.java:593)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.putUpWebUI(HRegionServer.java:1417)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.startServiceThreads(HRegionServer.java:1383)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:926)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.handleReportForDutyResponse(MiniHBaseCluster.java:110)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:639)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.runRegionServer(MiniHBaseCluster.java:136)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.access$000(MiniHBaseCluster.java:89)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer$1.run(MiniHBaseCluster.java:120)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:357)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1118)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:622)
	at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
	at org.apache.hadoop.hbase.security.User.call(User.java:586)
	at org.apache.hadoop.hbase.security.User.access$700(User.java:50)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:426)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.run(MiniHBaseCluster.java:118)
	at java.lang.Thread.run(Thread.java:701)
2014-02-01 00:32:07,391 [RegionServer:0;vesta.apache.org,48767,1391214719942] FATAL org.apache.hadoop.hbase.regionserver.HRegionServer - RegionServer abort: loaded coprocessors are: []
2014-02-01 00:32:07,393 [IPC Server handler 4 on 56307] ERROR org.apache.hadoop.hbase.master.HMaster - Region server vesta.apache.org,48767,1391214719942 reported a fatal error:
ABORTING region server vesta.apache.org,48767,1391214719942: Unhandled exception: Address already in use
Cause:
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:174)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:139)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77)
	at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
	at org.apache.hadoop.http.HttpServer.start(HttpServer.java:593)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.putUpWebUI(HRegionServer.java:1417)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.startServiceThreads(HRegionServer.java:1383)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:926)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.handleReportForDutyResponse(MiniHBaseCluster.java:110)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:639)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.runRegionServer(MiniHBaseCluster.java:136)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.access$000(MiniHBaseCluster.java:89)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer$1.run(MiniHBaseCluster.java:120)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:357)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1118)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:622)
	at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
	at org.apache.hadoop.hbase.security.User.call(User.java:586)
	at org.apache.hadoop.hbase.security.User.access$700(User.java:50)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:426)
	at org.apache.hadoop.hbase.MiniHBaseCluster$MiniHBaseClusterRegionServer.run(MiniHBaseCluster.java:118)
	at java.lang.Thread.run(Thread.java:701)
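Editor's note: the root cause above is not HBase itself but the embedded Jetty info server. HRegionServer.putUpWebUI tries to bind the region server's web UI port, another process on the shared Jenkins slave already holds it, and the region server aborts. One common way to make mini-cluster tests immune to this is to disable the info servers entirely before the cluster starts, via the standard hbase.master.info.port and hbase.regionserver.info.port properties (a value of -1 turns the UI off). The sketch below drives HBaseTestingUtility directly; whether the James test harness (e.g. HBaseClusterSingleton) exposes an equivalent hook is an assumption, so treat this as illustrative only.

// A minimal sketch, assuming the goal is simply to stop the mini cluster's
// web UIs from competing for ports with other jobs on a shared build slave.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HBaseTestingUtility;

public class MiniClusterWithoutWebUi {

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // -1 tells HBase not to start the Jetty info server at all, so
        // putUpWebUI never tries to bind a port and the "Address already
        // in use" abort seen above cannot occur.
        conf.setInt("hbase.master.info.port", -1);
        conf.setInt("hbase.regionserver.info.port", -1);

        HBaseTestingUtility util = new HBaseTestingUtility(conf);
        util.startMiniCluster();   // in-process DFS, ZooKeeper and HBase
        try {
            // ... run test logic against util.getConfiguration() ...
        } finally {
            util.shutdownMiniCluster();
        }
    }
}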
Tests run: 2, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 615.961 sec <<< FAILURE!
org.apache.james.rrt.hbase.HBaseRecipientRewriteTableTest Time elapsed: 0.019 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.apache.james.rrt.hbase.HBaseRecipientRewriteTableTest
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:190)
	at org.apache.maven.surefire.report.SmartStackTraceParser.getClass(SmartStackTraceParser.java:63)
	at org.apache.maven.surefire.report.SmartStackTraceParser.<init>(SmartStackTraceParser.java:53)
	at org.apache.maven.surefire.common.junit4.JUnit4StackTraceWriter.smartTrimmedStackTrace(JUnit4StackTraceWriter.java:72)
	at org.apache.maven.surefire.booter.ForkingRunListener.encode(ForkingRunListener.java:328)
	at org.apache.maven.surefire.booter.ForkingRunListener.encode(ForkingRunListener.java:312)
	at org.apache.maven.surefire.booter.ForkingRunListener.toString(ForkingRunListener.java:258)
	at org.apache.maven.surefire.booter.ForkingRunListener.testError(ForkingRunListener.java:131)
	at org.apache.maven.surefire.common.junit4.JUnit4RunListener.testFailure(JUnit4RunListener.java:111)
	at org.junit.runner.notification.RunNotifier$4.notifyListener(RunNotifier.java:139)
	at org.junit.runner.notification.RunNotifier$SafeNotifier.run(RunNotifier.java:61)
	at org.junit.runner.notification.RunNotifier.fireTestFailures(RunNotifier.java:134)
	at org.junit.runner.notification.RunNotifier.fireTestFailure(RunNotifier.java:128)
	at org.junit.internal.runners.model.EachTestNotifier.addFailure(EachTestNotifier.java:23)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:315)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
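Editor's note: the NoClassDefFoundError reported here is almost certainly a follow-on symptom rather than a separate problem. Once the region server aborted with the BindException above, the static setup of HBaseRecipientRewriteTableTest most likely could not complete, the JVM marked the class as failed, and every later touch of it (including Surefire's own error reporting) only sees "Could not initialize class ...". So the entry worth chasing in this log is the earlier BindException. A minimal, self-contained illustration of that JVM behaviour follows; all class and method names in it are hypothetical.

// If a class's static initialisation throws, the first use surfaces an
// ExceptionInInitializerError carrying the real cause; every later use of
// the same class only yields "NoClassDefFoundError: Could not initialize
// class ...", with the original cause hidden.
public class StaticInitFailureDemo {

    static class BrokenFixture {
        // Stand-in for a test fixture whose static setup (for example,
        // obtaining a shared mini cluster) fails.
        static final String RESOURCE = initResource();

        private static String initResource() {
            throw new IllegalStateException("cluster did not start");
        }
    }

    public static void main(String[] args) {
        try {
            System.out.println(BrokenFixture.RESOURCE);
        } catch (Throwable t) {
            System.out.println("first use:  " + t);  // ExceptionInInitializerError with the real cause
        }
        try {
            System.out.println(BrokenFixture.RESOURCE);
        } catch (Throwable t) {
            System.out.println("second use: " + t);  // NoClassDefFoundError: Could not initialize class ...
        }
    }
}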
Running org.apache.james.domainlist.hbase.HBaseDomainListTest
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-log4j12/1.7.2/slf4j-log4j12-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-simple/1.7.2/slf4j-simple-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2014-02-01 00:42:02,805 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
2014-02-01 00:42:03,502 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:42:03,529 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:42:03,541 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
Starting DataNode 0 with dfs.data.dir: <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/test-data/f1f9d371-a553-494b-a8ce-7e6926cf635c/dfscluster_9f237474-ca20-4c58-b14c-403c9528ac68/dfs/data/data1,/home/jenkins/jenkins-slave/workspace/james-server-trunk/trunk/data/data-hbase/target/test-data/f1f9d371-a553-494b-a8ce-7e6926cf635c/dfscluster_9f237474-ca20-4c58-b14c-403c9528ac68/dfs/data/data2>
2014-02-01 00:42:04,331 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - NameNode metrics system already initialized!
2014-02-01 00:42:04,332 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:42:05,046 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name jvm already exists!
Cluster is active
2014-02-01 00:42:17,790 [org.apache.hadoop.hdfs.server.datanode.DataXceiver@388a2006] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-02-01 00:42:26,819 [Master:0;vesta.apache.org,44794,1391215337154] WARN org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper - Node /hbase/root-region-server already deleted, and this is not a retry
2014-02-01 00:42:29,109 [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:21818] WARN org.apache.zookeeper.server.NIOServerCnxn - caught end of stream exception
EndOfStreamException: Unable to read additional data from client sessionid 0x143eae55a2e0002, likely client has closed socket
	at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:220)
	at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:224)
	at java.lang.Thread.run(Thread.java:701)
2014-02-01 00:42:33,733 [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:21818] WARN org.apache.zookeeper.server.NIOServerCnxn - caught end of stream exception
EndOfStreamException: Unable to read additional data from client sessionid 0x143eae55a2e0008, likely client has closed socket
	at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:220)
	at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:224)
	at java.lang.Thread.run(Thread.java:701)
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.501 sec
2014-02-01 00:42:39,613 [RegionServer:0;vesta.apache.org,52128,1391215337670] WARN org.apache.hadoop.hbase.regionserver.HRegionServer - Received close for region we are already opening or closing; 70236052
Running org.apache.james.system.hbase.TablePoolTest
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-log4j12/1.7.2/slf4j-log4j12-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-simple/1.7.2/slf4j-simple-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2014-02-01 00:42:43,895 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
2014-02-01 00:42:46,558 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:42:46,591 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:42:46,603 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
Starting DataNode 0 with dfs.data.dir: <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/test-data/d9ccdaa3-51ed-42b3-96ba-c975e792f364/dfscluster_f074dc4e-cdbd-4fdf-999d-b8b158230f12/dfs/data/data1,/home/jenkins/jenkins-slave/workspace/james-server-trunk/trunk/data/data-hbase/target/test-data/d9ccdaa3-51ed-42b3-96ba-c975e792f364/dfscluster_f074dc4e-cdbd-4fdf-999d-b8b158230f12/dfs/data/data2>
2014-02-01 00:42:50,301 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - NameNode metrics system already initialized!
2014-02-01 00:42:50,301 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:42:51,529 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name jvm already exists!
Cluster is active
2014-02-01 00:42:56,429 [org.apache.hadoop.hdfs.server.datanode.DataXceiver@4b7d03c5] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-02-01 00:43:07,345 [Master:0;vesta.apache.org,53268,1391215374746] WARN org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper - Node /hbase/root-region-server already deleted, and this is not a retry
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.406 sec
2014-02-01 00:43:23,407 [RegionServer:0;vesta.apache.org,36186,1391215376255] WARN org.apache.hadoop.hbase.regionserver.HRegionServer - Received close for region we are already opening or closing; 70236052
Running org.apache.james.user.hbase.HBaseUsersRepositoryTest
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-log4j12/1.7.2/slf4j-log4j12-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/slf4j/slf4j-simple/1.7.2/slf4j-simple-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2014-02-01 00:43:30,685 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported. This configuration option is no longer required to enable sync.
2014-02-01 00:43:34,773 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:43:34,804 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:43:34,816 [main] WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem - The dfs.support.append option is in your configuration, however append is not supported.
Starting DataNode 0 with dfs.data.dir: <https://builds.apache.org/job/james-server-trunk/org.apache.james$james-server-data-hbase/ws/target/test-data/efac57f0-7525-4f07-bd64-56efae58d4e2/dfscluster_03f99a21-3f88-48b2-a332-845baa4d7d5e/dfs/data/data1,/home/jenkins/jenkins-slave/workspace/james-server-trunk/trunk/data/data-hbase/target/test-data/efac57f0-7525-4f07-bd64-56efae58d4e2/dfscluster_03f99a21-3f88-48b2-a332-845baa4d7d5e/dfs/data/data2>
2014-02-01 00:43:38,510 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - NameNode metrics system already initialized!
2014-02-01 00:43:38,510 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name ugi already exists!
2014-02-01 00:43:40,321 [main] WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl - Source name jvm already exists!
Cluster is active
2014-02-01 00:43:43,256 [org.apache.hadoop.hdfs.server.datanode.DataXceiver@35f6ef01] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-02-01 00:43:53,594 [Master:0;vesta.apache.org,44535,1391215422187] WARN org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper - Node /hbase/root-region-server already deleted, and this is not a retry
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.729 sec
2014-02-01 00:44:09,451 [RegionServer:0;vesta.apache.org,35378,1391215423044] WARN org.apache.hadoop.hbase.regionserver.HRegionServer - Received close for region we are already opening or closing; 70236052

Results :

Tests in error:
  JUnit4Provider.invoke:124->executeTestSet:153->execute:264 ? NoClassDefFound C...

Tests run: 18, Failures: 0, Errors: 1, Skipped: 0

[JENKINS] Recording test results

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
