See https://builds.apache.org/job/Hadoop-Hdfs-trunk/744/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1082561 lines...]
    [junit] 2011-08-09 13:31:14,572 WARN  blockmanagement.BlockManager (BlockManager.java:run(2604)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit]     at java.lang.Thread.sleep(Native Method)
    [junit]     at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2602)
    [junit]     at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-09 13:31:14,572 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(822)) - Ending log segment 1
    [junit] 2011-08-09 13:31:14,572 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-09 13:31:14,581 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(501)) - Number of transactions: 8 Total time for transactions(ms): 1Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 45 38
    [junit] 2011-08-09 13:31:14,583 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 60052
    [junit] 2011-08-09 13:31:14,583 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 60052: exiting
    [junit] 2011-08-09 13:31:14,584 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 60052
    [junit] 2011-08-09 13:31:14,584 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort60052
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort60052
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort54789
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort54789
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-41739
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort33237
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort33237
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-34160
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort58558
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort58558
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-58667
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort39400
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort39400
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-09 13:31:14,589 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-39667
    [junit] 2011-08-09 13:31:14,589 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-09 13:31:14,589 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.248 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 118 minutes 43 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2238
Updating HDFS-2230
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend

Error Message:
testComplexAppend Worker encountered exceptions.

Stack Trace:
junit.framework.AssertionFailedError: testComplexAppend Worker encountered exceptions.
        at org.apache.hadoop.hdfs.TestFileAppend2.__CLR2_4_3dvc5331967(TestFileAppend2.java:387)
        at org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend(TestFileAppend2.java:332)


REGRESSION:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
        at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
        at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p150k(TestHDFSServerPorts.java:350)
        at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
        at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
        at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:627)
        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:542)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:258)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:86)
        at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:244)
        at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1h6y(TestCheckpoint.java:560)
        at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
        at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
        at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f1u(TestNNThroughputBenchmark.java:39)
        at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
        at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
        at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bdv(TestValidateConfigurationSettings.java:49)
        at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
        at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
        at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1be5(TestValidateConfigurationSettings.java:71)
        at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)


FAILED:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.


