Thanks, Todd. That gave a hint as to what exactly was wrong: the permissions on the conf directory were wrong.
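For the archives: the likely failure mode is the conf directory not being readable by the other user, so hadoop silently fell back to its built-in defaults (the local filesystem). A sketch of the kind of fix, assuming a stock install under /usr/local/hadoop (your path may differ):

    # Hypothetical install path; adjust to wherever your conf directory lives.
    # a+rX = read for everyone, plus execute (traversal) on directories only.
    chmod -R a+rX /usr/local/hadoop/conf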

On Jul 17, 2009, at 1:38 PM, Todd Lipcon wrote:

Hi Vijay,

It sounds like your user is seeing a different configuration than the hadoop
user, such that fs.default.name isn't getting set. Did you set a
HADOOP_CONF_DIR environment variable in the bashrc of the user that runs the
hadoop process? If so, you should instead set that variable in hadoop-env.sh.
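For example (the path, hostname, and port below are placeholders for whatever your cluster actually uses), in conf/hadoop-env.sh:

    # Hypothetical path; point this at the conf directory of your install.
    export HADOOP_CONF_DIR=/usr/local/hadoop/conf

and fs.default.name belongs in the site configuration (core-site.xml on 0.20, hadoop-site.xml on earlier releases):

    <property>
      <name>fs.default.name</name>
      <!-- placeholder host:port; use your namenode's actual address -->
      <value>hdfs://namenode.example.com:9000</value>
    </property>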

-Todd


On Fri, Jul 17, 2009 at 11:09 AM, Vijay Kumar Adhikari
<[email protected]> wrote:

I have an installation of Hadoop where, when I log into the account
that runs the Hadoop process, everything looks fine. I can copy local
files to the DFS, view the files inside the DFS, and so on.

When I log in as a different user, I can still run all the dfs commands,
but dfs shows files from my current local directory. If I am in my
home directory on the local machine and issue "hadoop dfs -ls", it
lists all the files in my local home directory. If I change to some
other local directory and issue the same command, all the files from
that directory are shown. When I issue "hadoop dfs -copyFromLocal
<filename>", it says the URI already exists.

What is wrong here? What do I need to fix?

--
Vijay

