Raghu Angadi wrote:

This is at RPC client level and there is no requirement for a fully qualified
hostname; anything that resolves to an IP address is fine (at least for
common/FS and HDFS). Maybe the "." at the end of "10.2.24.21" is causing the
problem?

(Correcting my earlier message, which read "there is requirement" -- I meant to
say "there is NO requirement".)

Btw, in 0.21 even fs.default.name does not need to be fully qualified; that fix
is probably in 0.20 too.

Raghu.
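The behavior Raghu describes can be seen with plain java.net.InetAddress: a bare IPv4 literal like "10.2.24.21" parses directly with no DNS lookup, while the same string with a trailing dot is no longer a valid address literal, so the resolver treats it as a DNS name and typically fails. A minimal sketch (the class name is mine, not from Hadoop; the second result depends on your resolver):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class TrailingDotCheck {
    public static void main(String[] args) {
        // "10.2.24.21"  -> valid IPv4 literal, resolved without DNS
        // "10.2.24.21." -> not an IPv4 literal, goes to DNS and usually fails
        for (String host : new String[] {"10.2.24.21", "10.2.24.21."}) {
            try {
                InetAddress addr = InetAddress.getByName(host);
                System.out.println(host + " -> " + addr.getHostAddress());
            } catch (UnknownHostException e) {
                System.out.println(host + " -> UnknownHostException");
            }
        }
    }
}
```

This matches the exception in the trace below: the host string "10.2.24.21." (with the stray dot) never resolves, so the RPC client throws UnknownHostException before it can connect.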

Matt Massie wrote:
fs.default.name in your hadoop-site.xml needs to be set to a fully-qualified domain name (instead of an IP address).

-Matt
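For reference, a minimal fs.default.name entry in hadoop-site.xml looks like the fragment below. The hostname and port are illustrative only, not taken from this thread; substitute your own namenode. Whichever form you use, make sure there is no stray trailing "." on the host:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- illustrative value; use your own namenode host and port -->
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>
```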

On Jun 23, 2009, at 6:42 AM, bharath vissapragada wrote:

When I try to execute the command bin/start-dfs.sh, I get the following error. I have checked the hadoop-site.xml file on all the nodes, and they are fine.
Can someone help me out?

10.2.24.21: Exception in thread "main" java.net.UnknownHostException:
unknown host: 10.2.24.21.
10.2.24.21:     at
org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
10.2.24.21:     at
org.apache.hadoop.ipc.Client.getConnection(Client.java:779)
10.2.24.21:     at org.apache.hadoop.ipc.Client.call(Client.java:704)
10.2.24.21:     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
10.2.24.21:     at org.apache.hadoop.dfs.$Proxy4.getProtocolVersion(Unknown Source)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:306)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:343)
10.2.24.21:     at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:288)



