Did you configure fs.default.name with the DFS address? You might have configured file:///. If yes, please update it to the DFS address hdfs://xx.xx.xx.xx:port and try again.

You need to add this in the core-site.xml file.
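For example, a minimal core-site.xml might look like the sketch below (the host name and port are placeholders; substitute your own NameNode address):

```xml
<!-- core-site.xml: point the default filesystem at HDFS instead of
     the local filesystem (file:///).
     "namenode-host" and "9000" are placeholders for illustration;
     use your NameNode's actual address and port. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```

After updating the file, restart the daemons; `hadoop fs -ls /` should then list the HDFS root rather than the contents of the local filesystem.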

Regards,
Uma 
----- Original Message -----
From: Joey Echeverria <[email protected]>
Date: Tuesday, November 8, 2011 5:37 pm
Subject: Re: Hadoop PseudoDistributed configuration
To: "[email protected]" <[email protected]>
Cc: "[email protected]" <[email protected]>

> What is your setting for fs.default.name?
> 
> -Joey
> 
> On Nov 8, 2011, at 5:54, Paolo Di Tommaso 
> <[email protected]> wrote:
> 
> > Dear all,
> > 
> > I'm trying to install Hadoop (0.20.2) in pseudo distributed mode 
> to run
> > some tests on a Linux machine (Fedora 8) .
> > 
> > I have followed the installation steps in the guide available here
> > 
> > http://hadoop.apache.org/common/docs/current/single_node_setup.html#PseudoDistributed
> 
> > 
> > The daemons start with no problem, but when I access the HDFS file
> > system (hadoop fs -ls /) it shows all the content of the underlying
> > (real) file system. This seems really strange to me because I'm
> > expecting that HDFS should work as an independent file system.
> > 
> > 
> > Has anybody had the same problem? Any suggestions to check where
> > I'm failing to configure Hadoop?
> > 
> > 
> > Thank you,
> > Paolo
> 