Hi Nitin,


Normally your conf should reside in /etc/hadoop/conf. If you don't have one,
copy it from the namenode, and keep it in sync.
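For example, a minimal way to pull the conf from the namenode, assuming
passwordless ssh and that "namenode" is a placeholder for your actual
namenode hostname:

[hdfs]$ rsync -av namenode:/etc/hadoop/conf/ /etc/hadoop/conf/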



The hadoop script by default sources hadoop-config.sh, which in turn sources
hadoop-env.sh from /etc/hadoop/conf.
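A quick way to verify which conf dir the script actually resolved: in Hadoop
1.x the conf dir is the first entry on the classpath, so (if everything is
wired right you should see /etc/hadoop/conf or wherever your conf lives):

[hdfs]$ hadoop classpath | tr ':' '\n' | head -1
/etc/hadoop/conf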



Alternatively, specify the config dir at runtime, e.g.:



[hdfs]$  hadoop [--config <path to your config dir>] <commands>
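For instance, assuming the conf really is in /etc/hadoop/conf:

[hdfs]$ hadoop --config /etc/hadoop/conf dfs -ls /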





P.S. Some useful links:

http://wiki.apache.org/hadoop/FAQ

http://wiki.apache.org/hadoop/FrontPage

http://wiki.apache.org/hadoop/

http://hadoop.apache.org/common/docs/r1.0.3/



-----Original Message-----
From: d...@paraliatech.com [mailto:d...@paraliatech.com] On Behalf Of Dave Beech
Sent: Friday, July 13, 2012 6:18 AM
To: common-user@hadoop.apache.org
Subject: Re: hadoop dfs -ls



Hi Nitin



It's likely that your hadoop command isn't finding the right configuration.

In particular, it doesn't know where your namenode is (the fs.default.name
setting in core-site.xml).
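If that property is missing, Hadoop 1.x falls back to the core-default.xml
value file:///, which is exactly why dfs -ls / lists the local filesystem. A
quick check, assuming the conf lives in /etc/hadoop/conf (the
hdfs://your-namenode:8020 value below is just a placeholder for whatever your
cluster uses):

$ grep -A1 fs.default.name /etc/hadoop/conf/core-site.xml
    <name>fs.default.name</name>
    <value>hdfs://your-namenode:8020</value>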



Maybe you need to set the HADOOP_CONF_DIR environment variable to point to your 
conf directory.
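For example (the path is an assumption; use wherever your conf actually
lives):

$ export HADOOP_CONF_DIR=/etc/hadoop/conf
$ hadoop dfs -ls /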



Dave



On 13 July 2012 14:11, Nitin Pawar <nitinpawar...@gmail.com> wrote:



> Hi,
>
> I have done setup numerous times, but this time I did it after some break.
>
> I managed to get the cluster up and running fine, but when I do  hadoop
> dfs -ls /
>
> it actually shows me the contents of the Linux file system.
>
> I am using hadoop-1.0.3 on rhel5.6.
>
> Can anyone suggest what I must have done wrong?
>
> --
> Nitin Pawar
