Thanks. I think this problem has been solved. Port 50090 was bound by another application.
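In case it helps anyone else hitting the same thing: on a Linux box you can usually find the process that is holding a port with either of the following (50090 here is just the default SecondaryNameNode HTTP port from my case):

  netstat -tlnp | grep 50090
  lsof -i :50090

Both should print the PID and name of whatever is listening on that port.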
I added the following lines to hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.secondary.http.address</name>
    <value>0.0.0.0:8002</value>
  </property>
</configuration>

On 01/13/11 21:55, rahul patodi wrote:
> Hi,
> The error you are getting is because your port is not free; please check
> it as Harsh told.
> Another problem is with your configuration files:
> if you set up a Hadoop cluster, there should not be localhost.
> Your files should look like:
>
> core-site.xml
>
> <property>
>   <name>fs.default.name</name>
>   <value>hdfs://master:54310</value>
> </property>
>
> mapred-site.xml
>
> <property>
>   <name>mapred.job.tracker</name>
>   <value>master:54311</value>
> </property>
>
> hdfs-site.xml
>
> <property>
>   <name>dfs.replication</name>
>   <value>2</value>
> </property>
>
> You can visit
> http://hadoop-tutorial.blogspot.com/2010/11/running-hadoop-in-distributed-mode.html
>
> --
> *Regards*,
> Rahul Patodi
> Software Engineer,
> Impetus Infotech (India) Pvt Ltd,
> www.impetus.com
> Mob: 09907074413
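One more note, in case it is useful: after editing hdfs-site.xml and restarting the daemons, the SecondaryNameNode status page should answer on the new port. Assuming you run this on the machine hosting the SecondaryNameNode, a quick check is:

  curl http://localhost:8002/

If that returns the status page instead of a connection error, the new dfs.secondary.http.address has taken effect.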