fs -du has an 'h' option for human-readable values, but it doesn't seem to
work. Instead you can use something like this to print sizes in gigabytes;
adjust the 1024 multipliers for other units.
hadoop fs -du / | awk '{print ($1/(1024*1024*1024))"g" "\t" $2}'
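For the original question (cluster-wide totals, a Hadoop "df"), dfsadmin
also reports overall capacity. A minimal sketch: the dfsadmin call assumes
you are on a cluster node, and the echo line just checks the byte-to-gig
conversion from the one-liner above against sample du-style output.

```shell
# Cluster-wide capacity report (run on a node with HDFS access):
#   hadoop dfsadmin -report
# The awk conversion itself can be sanity-checked locally with fake
# du-style output (size in bytes, then path):
echo "1073741824 /user/data" | awk '{printf "%.1fg\t%s\n", $1/(1024*1024*1024), $2}'
```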
On 10/4/10 2:04 AM, "[email protected]"
<[email protected]> wrote:
> From: Sandhya E <[email protected]>
> Date: Sat, 2 Oct 2010 23:36:55 +0530
> To: <[email protected]>
> Subject: Re: Total Space Available on Hadoop Cluster Or Hadoop version of
> "df".
>
> There is an fs -du command that can be useful. Or the Hadoop DFS
> web UI shows the stats as well.
>
> On Sat, Oct 2, 2010 at 9:44 AM, rahul <[email protected]> wrote:
>> Hi,
>>
>> I am using Hadoop 0.20.2 version for data processing by setting up Hadoop
>> Cluster on two nodes.
>>
>> And I am continuously adding more space to the nodes.
>>
>> Can somebody let me know how to get the total space available on the Hadoop
>> cluster using the command line.
>>
>> Or
>>
>> the Hadoop equivalent of the Unix "df" command.
>>
>> Any input is helpful.
>>
>> Thanks
>> Rahul