hi, see the du and df command output below.
[hadoop@master-26161 hadoop]$ /opt/hadoop-2.4.1/bin/hdfs dfs -du -h /
14/09/25 01:35:03 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
91.9 G    /hbase
492.7 M   /var
390.9 M   /tmp
4.3 M     /user

[hadoop@master-26161 hadoop]$ /opt/hadoop-2.4.1/bin/hdfs dfs -df -h
14/09/25 01:41:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Filesystem                 Size     Used     Available  Use%
hdfs://master-26151:9000   1.3 T    642.8 G  711.0 G    47%

As you can see, "du" says there are about 93 G of data under "/", but "df" says we have used 642.8 G. Our replication factor is set to 3, so 93 * 3 = 279 G would make sense.

So my questions are: what is using the rest of the disk space, and how can I clean it up?

Thanks.
Tang
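For reference, here is the arithmetic behind my question as a small sketch (the figures are taken from the du/df output above; "unexplained" is just the gap I am asking about):

```python
# Sanity check of the numbers quoted above (all values in GB).
# du output is the logical size per directory, before replication.
logical_gb = 91.9 + 0.4927 + 0.3909 + 0.0043   # /hbase + /var + /tmp + /user

replication = 3                                 # our configured replication factor
expected_raw_gb = logical_gb * replication      # raw bytes du implies on disk

reported_raw_gb = 642.8                         # "Used" column from dfs -df
unexplained_gb = reported_raw_gb - expected_raw_gb

print(round(expected_raw_gb, 1))                # ~278.4 G expected
print(round(unexplained_gb, 1))                 # ~364.4 G unaccounted for
```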
