On 16/08/11 11:19, A Df wrote:
See inline



________________________________
From: Steve Loughran <[email protected]>
To: [email protected]
Sent: Tuesday, 16 August 2011, 11:08
Subject: Re: hadoop cluster mode not starting up

On 16/08/11 11:02, A Df wrote:
Hello All:

I used a combination of tutorials to set up Hadoop, but most of them either use
an old version of Hadoop or only use 2 machines for the cluster, which isn't
really a cluster. Does anyone know of a good tutorial that sets up multiple
nodes for a cluster? I already looked at the Apache website, but it does not
give sample values for the conf files. Also, each set of tutorials seems to
indicate a different set of parameters that should be changed, so now it's a
bit confusing. For example, my configuration sets a dedicated namenode, a
secondary namenode and 8 slave nodes, but when I run the start command it gives
an error. Should I install Hadoop in my user directory or under the root? I
have it in my user directory, but all the nodes share a central file system
(my home directory is the same on every node) rather than each having its own,
so whatever I do in my user folder on one node affects all the others. How do
I set the paths so that each node uses its own local storage?
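
To make it concrete, the kind of sample values I was hoping to find would be
something like this (only a sketch; "namenode-host" and the /local/scratch
path are placeholders for whatever is right on my cluster, not values from any
tutorial):

CODE
# conf/core-site.xml: where the namenode lives, and where each node keeps
# its working data -- a node-local path, NOT the shared /home directory
cat > ${HADOOP_HOME}/conf/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/local/scratch/hadoop-${user.name}</value>
  </property>
</configuration>
EOF

# conf/mapred-site.xml: the jobtracker address
cat > ${HADOOP_HOME}/conf/mapred-site.xml <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value>
  </property>
</configuration>
EOF

# conf/masters holds the secondary namenode's hostname;
# conf/slaves holds one slave hostname per line
CODE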

For the errors below, I checked the directories and the files are there. I am
not sure what went wrong, or how to set the conf so that the nodes don't all
write to the central file system. Thank you.

Error message
CODE
w1153435@n51:~/hadoop-0.20.2_cluster> bin/start-dfs.sh
bin/start-dfs.sh: line 28: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-config.sh: No such file or directory
bin/start-dfs.sh: line 50: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemon.sh: No such file or directory
bin/start-dfs.sh: line 51: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh: No such file or directory
bin/start-dfs.sh: line 52: /w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh: No such file or directory
CODE

there's no such file or directory as
/w1153435/hadoop-0.20.2_cluster/bin/hadoop-daemons.sh; note that the path in
the error is missing the /home prefix, so the scripts are not looking where
you think they are.


There is; I checked, as shown:
w1153435@n51:~/hadoop-0.20.2_cluster>  ls bin
hadoop             rcc                start-dfs.sh      stop-dfs.sh
hadoop-config.sh   slaves.sh          start-mapred.sh   stop-mapred.sh
hadoop-daemon.sh   start-all.sh       stop-all.sh
hadoop-daemons.sh  start-balancer.sh  stop-balancer.sh

try "pwd" to print out where the OS thinks you are, as it doesn't seem to be where you think you are
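
Something along these lines would show where the mismatch is; the error path
has no /home prefix, so whatever prefix the scripts are deriving is wrong:

CODE
pwd                                          # where the shell actually is
echo "HADOOP_HOME=${HADOOP_HOME}"            # does this include /home/w1153435 ?
ls -l ${HADOOP_HOME}/bin/hadoop-config.sh    # the file start-dfs.sh could not find
CODE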

I had tried running the commands below earlier but also had problems:
CODE
w1153435@ngs:~/hadoop-0.20.2_cluster> export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
w1153435@ngs:~/hadoop-0.20.2_cluster> export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
w1153435@ngs:~/hadoop-0.20.2_cluster> ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
-bash: /bin/slaves.sh: No such file or directory
w1153435@ngs:~/hadoop-0.20.2_cluster> export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
w1153435@ngs:~/hadoop-0.20.2_cluster> ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
cat: /conf/slaves: No such file or directory
CODE

there's no such file or directory as /conf/slaves because you set HADOOP_HOME
after setting the other env variables; shell variables are expanded when they
are assigned, not when they are used.
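
In other words, the order of the exports is what matters here; a minimal
illustration:

CODE
export HADOOP_CONF_DIR=${HADOOP_HOME}/conf    # HADOOP_HOME is empty at this point,
echo $HADOOP_CONF_DIR                         # so this prints just "/conf"

export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
echo $HADOOP_CONF_DIR                         # still "/conf": the earlier
                                              # assignment is not re-evaluated

# right order: set HADOOP_HOME first, then everything derived from it
export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
CODE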

I redid the commands in the right order but still get errors on the slaves:


w1153435@n51:~/hadoop-0.20.2_cluster> export HADOOP_HOME=/home/w1153435/hadoop-0.20.2_cluster
w1153435@n51:~/hadoop-0.20.2_cluster> export HADOOP_CONF_DIR=${HADOOP_HOME}/conf
w1153435@n51:~/hadoop-0.20.2_cluster> export HADOOP_SLAVES=${HADOOP_CONF_DIR}/slaves
w1153435@n51:~/hadoop-0.20.2_cluster> ${HADOOP_HOME}/bin/slaves.sh "mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop"
privn51: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn58: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn52: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn55: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn57: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn54: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn53: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory
privn56: bash: mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop: No such file or directory

try ssh-ing in and doing it by hand; make sure you have the right permissions etc.
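
The error text is also a clue: the remote bash reports the whole string
"mkdir -p ..." as one command name, which points at quoting rather than a
missing directory. Hand tests along these lines would separate the two
(privn51 taken from the output above):

CODE
# run the command directly on one slave; unquoted, the remote shell
# splits it into words as normal
ssh privn51 mkdir -p /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop

# then check the result and the permissions by hand
ssh privn51 ls -ld /home/w1153435/hadoop-0.20.2_cluster/tmp/hadoop
CODE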
