See <https://builds.apache.org/job/Blur-master-jdk7/org.apache.blur$blur-store/667/>
------------------------------------------
[...truncated 338 lines...]
INFO 20140826_11:36:41:006_UTC [main] hdfs_v2.HdfsKeyValueStore: Rolling file [hdfs://localhost:34114/test/000000000002]
INFO 20140826_11:36:41:410_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:34114/test/000000000003].
ERROR 20140826_11:36:41:412_UTC [IPC Server handler 0 on 34114] security.UserGroupInformation: PriviledgedActionException as:jenkins cause:java.io.IOException: failed to create file /test/000000000003 on client 127.0.0.1 either because the filename is invalid or the file exists
INFO 20140826_11:36:41:414_UTC [main] hdfs_v2.HdfsKeyValueStore: Rolling file [hdfs://localhost:34114/test/000000000004]
INFO 20140826_11:36:41:419_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:34114/test/000000000005].
INFO 20140826_11:36:41:428_UTC [main] hdfs_v2.HdfsKeyValueStore: Rolling file [hdfs://localhost:34114/test/000000000005]
INFO 20140826_11:36:41:834_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:34114/test/000000000006].
1000
1000
1000
ERROR 20140826_11:36:41:904_UTC [HDFS KV Store [hdfs://localhost:34114/test]] hdfs_v2.HdfsKeyValueStore: Unknown error while trying to clean up old files.
java.io.IOException: No longer the owner of [hdfs://localhost:34114/test]
    at org.apache.blur.store.hdfs_v2.HdfsKeyValueStore.cleanupOldFiles(HdfsKeyValueStore.java:383)
    at org.apache.blur.store.hdfs_v2.HdfsKeyValueStore$3.run(HdfsKeyValueStore.java:219)
    at java.lang.Thread.run(Thread.java:724)
INFO 20140826_11:36:41:928_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:34114/test/000000000001].
INFO 20140826_11:36:41:960_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:34114/test/000000000001].
INFO 20140826_11:36:41:994_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:34114/test/000000000001].
INFO 20140826_11:36:42:028_UTC [main] blur.HdfsMiniClusterUtil: Shutting down Mini DFS
Shutting down the Mini HDFS Cluster
Shutting down DataNode 0
INFO 20140826_11:36:42:029_UTC [main] mortbay.log: Stopped SelectChannelConnector@localhost:0
ERROR 20140826_11:36:42:030_UTC [HDFS KV Store [hdfs://localhost:34114/test]] hdfs_v2.HdfsKeyValueStore: Unknown error while trying to clean up old files.
java.io.IOException: No longer the owner of [hdfs://localhost:34114/test]
    at org.apache.blur.store.hdfs_v2.HdfsKeyValueStore.cleanupOldFiles(HdfsKeyValueStore.java:383)
    at org.apache.blur.store.hdfs_v2.HdfsKeyValueStore$3.run(HdfsKeyValueStore.java:219)
    at java.lang.Thread.run(Thread.java:724)
ERROR 20140826_11:36:42:135_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@1787eea] datanode.DataNode: DatanodeRegistration(127.0.0.1:57631, storageID=DS-696335807-67.195.81.148-57631-1409052998931, infoPort=36729, ipcPort=60372):DataXceiver
java.io.IOException: Interrupted receiveBlock
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:626)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
ERROR 20140826_11:36:42:137_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@6573f3] datanode.DataNode: DatanodeRegistration(127.0.0.1:57631, storageID=DS-696335807-67.195.81.148-57631-1409052998931, infoPort=36729, ipcPort=60372):DataXceiver
java.io.IOException: Interrupted receiveBlock
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:626)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
ERROR 20140826_11:36:42:137_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@17c87a4] datanode.DataNode: DatanodeRegistration(127.0.0.1:57631, storageID=DS-696335807-67.195.81.148-57631-1409052998931, infoPort=36729, ipcPort=60372):DataXceiver
java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
    at java.io.DataInputStream.read(DataInputStream.java:149)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:293)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:340)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:582)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
ERROR 20140826_11:36:42:137_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@1316b06] datanode.DataNode: DatanodeRegistration(127.0.0.1:57631, storageID=DS-696335807-67.195.81.148-57631-1409052998931, infoPort=36729, ipcPort=60372):DataXceiver
java.io.IOException: Interrupted receiveBlock
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:626)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
ERROR 20140826_11:36:42:135_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@14e60d2] datanode.DataNode: DatanodeRegistration(127.0.0.1:57631, storageID=DS-696335807-67.195.81.148-57631-1409052998931, infoPort=36729, ipcPort=60372):DataXceiver
java.io.IOException: Interrupted receiveBlock
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:626)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
ERROR 20140826_11:36:42:138_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@d41caa] datanode.DataNode: DatanodeRegistration(127.0.0.1:57631, storageID=DS-696335807-67.195.81.148-57631-1409052998931, infoPort=36729, ipcPort=60372):DataXceiver
java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
    at java.io.DataInputStream.read(DataInputStream.java:149)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:293)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:340)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:582)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
INFO 20140826_11:36:43:139_UTC [main] mortbay.log: Stopped SelectChannelConnector@localhost:0
INFO 20140826_11:36:43:252_UTC [main] blur.HdfsMiniClusterUtil: Shutting down FileSystem
INFO 20140826_11:36:43:252_UTC [main] blur.HdfsMiniClusterUtil: Stopping ThreadPoolExecutor [pool-1-thread-1]
INFO 20140826_11:36:43:254_UTC [main] blur.HdfsMiniClusterUtil: Waiting for thread pool to exit [pool-1-thread-1]
INFO 20140826_11:36:43:255_UTC [main] blur.HdfsMiniClusterUtil: Stopping ThreadPoolExecutor [pool-2-thread-1]
INFO 20140826_11:36:43:255_UTC [main] blur.HdfsMiniClusterUtil: Waiting for thread pool to exit [pool-2-thread-1]
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.463 sec
Running org.apache.blur.store.hdfs_v2.FastHdfsKeyValueDirectoryTest
INFO 20140826_11:36:44:068_UTC [main] blur.HdfsMiniClusterUtil: dfs.datanode.data.dir.perm=755
INFO 20140826_11:36:44:710_UTC [main] mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
INFO 20140826_11:36:44:789_UTC [main] mortbay.log: jetty-6.1.26
INFO 20140826_11:36:44:830_UTC [main] mortbay.log: Extract jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/hdfs to /tmp/Jetty_localhost_60530_hdfs____ikiubm/webapp
INFO 20140826_11:36:45:137_UTC [main] mortbay.log: Started SelectChannelConnector@localhost:60530
Starting DataNode 0 with dfs.data.dir: <https://builds.apache.org/job/Blur-master-jdk7/org.apache.blur$blur-store/ws/target/target/tmp/dfs/data/data1,/home/jenkins/jenkins-slave/workspace/Blur-master-jdk7/blur-store/target/target/tmp/dfs/data/data2>
INFO 20140826_11:36:45:290_UTC [main] mortbay.log: jetty-6.1.26
INFO 20140826_11:36:45:306_UTC [main] mortbay.log: Extract jar:file:/home/jenkins/jenkins-slave/maven-repositories/1/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/datanode to /tmp/Jetty_localhost_47219_datanode____mwdty6/webapp
INFO 20140826_11:36:45:518_UTC [main] mortbay.log: Started SelectChannelConnector@localhost:47219
Cluster is active
Cluster is active
INFO 20140826_11:36:45:846_UTC [main] hdfs_v2.HdfsKeyValueStore: Opening for writing [hdfs://localhost:59764/test/test_multiple/000000000001].
INFO 20140826_11:36:45:905_UTC [main] hdfs_v2.FastHdfsKeyValueDirectory: Running GC over the hdfs kv directory [hdfs://localhost:59764/test/test_multiple].
INFO 20140826_11:36:46:099_UTC [main] blur.HdfsMiniClusterUtil: Shutting down Mini DFS
Shutting down the Mini HDFS Cluster
Shutting down DataNode 0
INFO 20140826_11:36:46:099_UTC [main] mortbay.log: Stopped SelectChannelConnector@localhost:0
ERROR 20140826_11:36:46:203_UTC [org.apache.hadoop.hdfs.server.datanode.DataXceiver@132aed6] datanode.DataNode: DatanodeRegistration(127.0.0.1:34365, storageID=DS-1071971687-67.195.81.148-34365-1409053005543, infoPort=47219, ipcPort=50582):DataXceiver
java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[closed]. 0 millis timeout left.
    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
    at java.io.DataInputStream.read(DataInputStream.java:149)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:293)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:340)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:582)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
    at java.lang.Thread.run(Thread.java:724)
INFO 20140826_11:36:47:203_UTC [main] mortbay.log: Stopped SelectChannelConnector@localhost:0
INFO 20140826_11:36:47:316_UTC [main] blur.HdfsMiniClusterUtil: Shutting down FileSystem
INFO 20140826_11:36:47:317_UTC [main] blur.HdfsMiniClusterUtil: Stopping ThreadPoolExecutor [pool-1-thread-1]
INFO 20140826_11:36:47:317_UTC [main] blur.HdfsMiniClusterUtil: Waiting for thread pool to exit [pool-1-thread-1]
INFO 20140826_11:36:47:317_UTC [main] blur.HdfsMiniClusterUtil: Stopping ThreadPoolExecutor [pool-2-thread-1]
INFO 20140826_11:36:47:318_UTC [main] blur.HdfsMiniClusterUtil: Waiting for thread pool to exit [pool-2-thread-1]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.841 sec
Running org.apache.blur.store.blockcache.BlockDirectoryTest
Seed is -6136682278208007083
WARN 20140826_11:36:48:123_UTC [main] buffer.BufferStore: Buffer store for size [1,024] already setup.
WARN 20140826_11:36:48:127_UTC [main] buffer.BufferStore: Buffer store for size [8,192] already setup.
Seed is 8126605831377026233
Total time is 4303ms
WARN 20140826_11:36:52:432_UTC [main] buffer.BufferStore: Buffer store for size [1,024] already setup.
WARN 20140826_11:36:52:432_UTC [main] buffer.BufferStore: Buffer store for size [8,192] already setup.
Seed is -3940116796497214442
Total time is 2579ms
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.513 sec
Running org.apache.blur.store.blockcache.BlockCacheTest
Cache Hits = 4880
Cache Misses = 5120
Store = avg 0.0126526431 ms
Fetch = avg 0.0040589949000000005 ms
# of Elements = 4095
Cache Hits = 4684
Cache Misses = 5316
Store = avg 0.0125418164 ms
Fetch = avg 0.0040066423 ms
# of Elements = 4095
Cache Hits = 4691
Cache Misses = 5309
Store = avg 0.0133659702 ms
Fetch = avg 0.0041591975000000005 ms
# of Elements = 4095
Cache Hits = 4704
Cache Misses = 5296
Store = avg 0.0133725843 ms
Fetch = avg 0.004117600999999999 ms
# of Elements = 4095
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.128 sec
Running org.apache.blur.store.blockcache.BlockDirectoryCacheTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.336 sec
Results :
Tests run: 32, Failures: 0, Errors: 0, Skipped: 0
[JENKINS] Recording test results