Hi,

Tiger Uppercut wrote:
<snip/>

<property>
 <name>fs.default.name</name>
 <value>localhost:9000</value>
</property>

<!-- map/reduce properties -->

<property>
 <name>mapred.job.tracker</name>
 <value>localhost:9001</value>
</property>

For fs.default.name and mapred.job.tracker, try using the hostname of your machine instead of localhost. When you use localhost:XXXX, the Hadoop daemons listen on the loopback interface. But the MapReduce jobs (I do not know exactly where) see that the connections to the tasktrackers come from 127.0.0.1 and try to reverse-DNS that address. Your system will not return localhost but the real name of your machine. On most Linux systems that name is bound to an Ethernet interface, so the jobs will try to connect to that interface instead of the loopback one.
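For example, in hadoop-site.xml it would look something like this ("mybox" is just a placeholder here, use whatever `hostname` prints on your machine, and keep your own ports):

<property>
 <name>fs.default.name</name>
 <value>mybox:9000</value>
</property>

<property>
 <name>mapred.job.tracker</name>
 <value>mybox:9001</value>
</property>

After changing these you will need to restart the daemons (and possibly reformat the namenode if the filesystem URI changed) for the new addresses to take effect.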



<property>
 <name>dfs.name.dir</name>
 <value>/some_dir/hadoop/hadoop_data</value>
</property>

</configuration>
