Hi, I tried refreshing the nodes. I even restarted both the JobTracker and the NameNode, but nothing seems to change. I am using hadoop-1.0.3. Does the version make a difference?
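One detail worth checking on 1.0.3: as far as I can tell, the 1.x releases read the rack script path from topology.script.file.name; net.topology.script.file.name (the key set in the quoted message below) only became the canonical name in later releases, so on 1.0.3 it may simply be ignored. A minimal core-site.xml sketch for a 1.x install, assuming the script is saved as /etc/hadoop/conf/topology.sh (that filename is an assumption, it is not given in the thread):

  <property>
    <!-- key name read by Hadoop 1.x; later releases also accept net.topology.script.file.name -->
    <name>topology.script.file.name</name>
    <!-- assumed location of the rack script quoted below -->
    <value>/etc/hadoop/conf/topology.sh</value>
  </property>

Any change to this key still needs a NameNode and JobTracker restart, since rack locations are resolved when the nodes register.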
Thank you

On Tue, Mar 26, 2013 at 10:38 AM, bharath vissapragada <[email protected]> wrote:
> Hi,
>
> Try running 'hadoop dfsadmin -refreshNodes'! Your NN might have cached
> previously set values!
>
> Thanks,
>
> On Tue, Mar 26, 2013 at 10:31 AM, preethi ganeshan <[email protected]> wrote:
> > Hi,
> >
> > I used this script. In core-site.xml I have set net.topology.script.file.name
> > to this file's path. Then I executed the script and passed my computer's IP
> > address. It returned /dc1/rack1. However, when I ran my MapReduce job it
> > still says the job ran on default-rack. How can I change that?
> > Thank you
> > Regards,
> > Preethi Ganeshan
> >
> > (I have made the changes accordingly to fit my computer.)
> >
> > HADOOP_CONF=/etc/hadoop/conf
> >
> > while [ $# -gt 0 ] ; do
> >   nodeArg=$1
> >   exec< ${HADOOP_CONF}/topology.data
> >   result=""
> >   while read line ; do
> >     ar=( $line )
> >     if [ "${ar[0]}" = "$nodeArg" ] ; then
> >       result="${ar[1]}"
> >     fi
> >   done
> >   shift
> >   if [ -z "$result" ] ; then
> >     echo -n "/default/rack "
> >   else
> >     echo -n "$result "
> >   fi
> > done
> >
> > Topology data
> >
> > hadoopdata1.ec.com /dc1/rack1
> > hadoopdata1 /dc1/rack1
> > 10.1.1.1 /dc1/rack2
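As a further check, the rack script is consulted with whatever hostname or IP address each DataNode and TaskTracker registers with, so it can help to run the script by hand against every string you expect it to see. A minimal sketch, again assuming the script is saved as /etc/hadoop/conf/topology.sh (an assumed filename) and using the entries from topology.data above:

  # Feed the script the same strings the NameNode/JobTracker would pass it.
  # topology.sh is an assumed filename; the hosts come from topology.data above.
  for host in hadoopdata1.ec.com hadoopdata1 10.1.1.1 ; do
    printf '%s -> ' "$host"
    bash /etc/hadoop/conf/topology.sh "$host"
    echo
  done

Anything that prints /default/rack here, or a registration name (for example a reverse-resolved FQDN) missing from topology.data, would point at a name mismatch rather than at the script itself.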
