The contents are:

127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

On Sun, Aug 3, 2014 at 11:21 PM, Ritesh Kumar Singh <[email protected]> wrote:

> check the contents of '/etc/hosts' file
>
> On Mon, Aug 4, 2014 at 3:27 AM, S.L <[email protected]> wrote:
>
>> Hi All,
>>
>> I am trying to set up an Apache Hadoop 2.3.0 cluster. I have a master and
>> three slave nodes; the slave nodes are listed in the
>> $HADOOP_HOME/etc/hadoop/slaves file, and I can telnet from the slaves to
>> the master NameNode on port 9000. However, when I start the datanode on
>> any of the slaves I get the following exception:
>>
>> 2014-08-03 08:04:27,952 FATAL
>> org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for
>> block pool Block pool BP-1086620743-170.75.152.162-1407064313305 (Datanode
>> Uuid null) service to server1.dealyaft.com/170.75.152.162:9000
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
>> Datanode denied communication with namenode because hostname cannot be
>> resolved.
>>
>> The following are the contents of my core-site.xml:
>>
>> <configuration>
>>   <property>
>>     <name>fs.default.name</name>
>>     <value>hdfs://server1.mydomain.com:9000</value>
>>   </property>
>> </configuration>
>>
>> Also, in my hdfs-site.xml I am not setting any value for the dfs.hosts or
>> dfs.hosts.exclude properties.
>>
>> Thanks.
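The localhost-only hosts file above would explain the DisallowedDatanodeException: the namenode has no way to resolve the datanodes' hostnames (or reverse-resolve their IPs). A minimal sketch of the usual fix — the slave IPs and hostnames below are hypothetical placeholders and would need to be replaced with the cluster's real ones — is to add identical mappings for every cluster member to /etc/hosts on all nodes:

```
127.0.0.1       localhost localhost.localdomain localhost4 localhost4.localdomain4
::1             localhost localhost.localdomain localhost6 localhost6.localdomain6

# Hypothetical cluster entries -- substitute your real IPs and hostnames:
170.75.152.162  server1.mydomain.com  server1
192.0.2.11      slave1.mydomain.com   slave1
192.0.2.12      slave2.mydomain.com   slave2
192.0.2.13      slave3.mydomain.com   slave3
```

After updating, `getent hosts 192.0.2.11` on the namenode should print the corresponding hostname line, and `hostname -f` on each slave should print that slave's fully qualified name. Alternatively, if reverse lookup genuinely cannot be fixed, Hadoop 2.x has a property that relaxes the namenode's registration check (a workaround, not a fix — proper name resolution is preferable):

```
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>
```

Note also that in Hadoop 2.x `fs.default.name` is deprecated in favor of `fs.defaultFS`, though both still work.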
