Hello, everybody.
I just installed version 0.20 on 4 nodes. No matter how I configure it
(pretty much a standard setup), it always says the configured capacity
is 0 KB and 100% of space is used. Any attempt to put a file ends up
with an empty, 0-byte file. Just for the record, all tmp and HDFS image
directories are redirected to solid storage, and there are plenty of
hundreds of gigabytes free.
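For reference, the relevant hdfs-site.xml properties look roughly like this (the paths below are placeholders, not my exact ones):

```xml
<!-- hdfs-site.xml: dfs.name.dir / dfs.data.dir point at the big disk.
     Paths here are placeholders. -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/data/hdfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/data/hdfs/data</value>
  </property>
</configuration>
```

hadoop.tmp.dir in core-site.xml is likewise pointed at the same disk.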

Am I missing something, or is this a new "feature"? :-)


$ hadoop dfsadmin -report

Configured Capacity: 0 (0 KB)
Present Capacity: 9216 (9 KB)
DFS Remaining: 0 (0 KB)
DFS Used: 9216 (9 KB)
DFS Used%: 100%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0

-------------------------------------------------
Datanodes available: 3 (3 total, 0 dead)

Name: 192.168.1.242:50010
Decommission Status : Normal
Configured Capacity: 0 (0 KB)
DFS Used: 3072 (3 KB)
Non DFS Used: 0 (0 KB)
DFS Remaining: 0(0 KB)
DFS Used%: 100%
DFS Remaining%: 0%
Last contact: Thu Jul 02 01:47:21 JST 2009


Name: 192.168.1.241:50010
Decommission Status : Normal
Configured Capacity: 0 (0 KB)
DFS Used: 3072 (3 KB)
Non DFS Used: 0 (0 KB)
DFS Remaining: 0(0 KB)
DFS Used%: 100%
DFS Remaining%: 0%
Last contact: Thu Jul 02 01:47:23 JST 2009


Name: 192.168.1.243:50010
Decommission Status : Normal
Configured Capacity: 0 (0 KB)
DFS Used: 3072 (3 KB)
Non DFS Used: 0 (0 KB)
DFS Remaining: 0(0 KB)
DFS Used%: 100%
DFS Remaining%: 0%
Last contact: Thu Jul 02 01:47:23 JST 2009



-- 
Kind regards

Things that are stupid at the beginning rarely end up wisely.
