I have done this and found the following error in the log:

2013-05-08 18:53:45,221 WARN org.apache.hadoop.net.ScriptBasedMapping: Exception running /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh 127.0.0.1
org.apache.hadoop.util.Shell$ExitCodeException: /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh: 8: /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh: Syntax error: "(" unexpected (expecting "done")

        at org.apache.hadoop.util.Shell.runCommand(Shell.java:202)
        at org.apache.hadoop.util.Shell.run(Shell.java:129)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:322)
        at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:241)
        at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:179)
        at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:119)
        at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.resolveNetworkLocation(DatanodeManager.java:454)
        at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:713)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3459)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
        at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:18295)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:454)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1014)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1735)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1731)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1441)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1729)
2013-05-08 18:53:45,223 ERROR org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: The resolve call returned null! Using /default-rack for host [127.0.0.1]
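
For what it's worth, a `Syntax error: "(" unexpected (expecting "done")` usually means the script uses bash-only syntax (such as arrays, which the rack-awareness examples in the Hadoop docs rely on) but is being executed by a POSIX shell like dash, which is /bin/sh on many Debian/Ubuntu systems. Either change the shebang to `#!/bin/bash`, or write the script in portable POSIX sh. A minimal sketch of the latter (the subnets and rack names here are made up; Hadoop passes one or more IPs/hostnames as arguments and reads one rack path per argument from stdout):

```shell
#!/bin/sh
# Hypothetical POSIX-sh topology script: maps each node address given
# as an argument to a rack path printed on stdout, one line per node.
resolve_rack() {
  case "$1" in
    10.1.*) echo "/rack1" ;;           # example subnet -> rack mapping
    10.2.*) echo "/rack2" ;;
    *)      echo "/default-rack" ;;    # fallback for unknown hosts
  esac
}

for node in "$@"; do
  resolve_rack "$node"
done
```

Note also that in Hadoop 2.x the property was renamed to "net.topology.script.file.name" (set in core-site.xml); the old "topology.script.file.name" name is only kept as a deprecated alias.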



On Wed, May 8, 2013 at 7:18 PM, Leonid Fedotov <[email protected]> wrote:

> You can put this parameter in core-site.xml or hdfs-site.xml;
> both are parsed during HDFS startup.
>
> Leonid
>
>
> On Wed, May 8, 2013 at 6:43 AM, Mohammad Mustaqeem <[email protected]> wrote:
>
>> Hello everyone,
>>     I was searching for how to make the Hadoop cluster rack-aware, and I
>> found out from
>> http://hadoop.apache.org/docs/r2.0.4-alpha/hadoop-project-dist/hadoop-common/ClusterSetup.html#Hadoop_Rack_Awareness
>> that we can do this by setting the "topology.script.file.name" property.
>> But it is not written where to put this:
>> <property>
>>         <name>topology.script.file.name</name>
>>
>> <value>/home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh</value>
>> </property>
>>
>> I mean, in which configuration file?
>> I am using hadoop-2.0.3-alpha.
>>
>>
>> --
>> *With regards ---*
>> *Mohammad Mustaqeem*,
>> M.Tech (CSE)
>> MNNIT Allahabad
>> 9026604270
>>
>>
>>
>


-- 
*With regards ---*
*Mohammad Mustaqeem*,
M.Tech (CSE)
MNNIT Allahabad
9026604270
