Hello Mahmood Naderan,
When a client tries to connect to a server on the configured address and port, and no server is listening there, the client keeps retrying, and you get exactly the error you posted below.
From the jps output you posted, I can see that the NameNode is not running. Please check the NameNode logs (location:
/home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-namenode-tiger.out/log)
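A quick way to act on this advice is sketched below (assuming bash with /dev/tcp support and coreutils `timeout`; port 54310 comes from fs.default.name in conf/core-site.xml, and the log path mirrors the one printed by start-all.sh — adjust it for your installation):

```shell
# Probe the NameNode RPC port, then look at the NameNode log if it is down.
probe() {  # usage: probe HOST PORT -> prints "open" or "closed"
  if timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}

probe localhost 54310   # "closed" here means no NameNode is listening

# Inspect the end of the NameNode log for the real failure reason
# (path is an assumption based on the start-all.sh output; adjust as needed).
log=/home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-namenode-tiger.log
tail -n 50 "$log" 2>/dev/null || echo "log not found: $log"
```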
Date: Fri, 17 Apr 2015 08:22:12 +0000
From: [email protected]
To: [email protected]
Subject: ipc.Client: Retrying connect to server
Hello,
I have done all the steps (as far as I know) to bring up Hadoop. However, I get this error:

15/04/17 12:45:31 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).

There are a lot of threads and posts regarding this error, and I have tried their suggestions, but I am still stuck :(
Can someone help me? What did I do wrong?
Here are the configurations:
1) Hadoop configurations

[mahmood@tiger hadoop-1.0.2]$ cat conf/mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
  </property>
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx512m</value>
  </property>
</configuration>
[mahmood@tiger hadoop-1.0.2]$ cat conf/core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
[mahmood@tiger hadoop-1.0.2]$ cat conf/hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/mahmood/bigdatabench/apache/hadoop-1.0.2/folders/tmp</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/mahmood/bigdatabench/apache/hadoop-1.0.2/folders/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/mahmood/bigdatabench/apache/hadoop-1.0.2/folders/data</value>
  </property>
</configuration>
2) Network configuration

[root@tiger hadoop-1.0.2]# cat /etc/sysconfig/iptables
# Firewall configuration written by system-config-firewall
# Manual customization of this file is not recommended.
*filter
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
-A INPUT -p icmp -j ACCEPT
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 5901 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 80 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 2049 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 54310 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 54311 -j ACCEPT
-A INPUT -j REJECT --reject-with icmp-host-prohibited
-A FORWARD -j REJECT --reject-with icmp-host-prohibited
COMMIT
[root@tiger hadoop-1.0.2]# /etc/init.d/iptables restart
iptables: Flushing firewall rules:                         [  OK  ]
iptables: Setting chains to policy ACCEPT: filter          [  OK  ]
iptables: Unloading modules:                               [  OK  ]
iptables: Applying firewall rules:                         [  OK  ]
[mahmood@tiger hadoop-1.0.2]$ netstat -an | grep 54310
[mahmood@tiger hadoop-1.0.2]$ netstat -an | grep 54311
tcp        0      0 ::ffff:127.0.0.1:54311      :::*                        LISTEN
tcp      426      0 ::ffff:127.0.0.1:54311      ::ffff:127.0.0.1:49639      ESTABLISHED
tcp        0      0 ::ffff:127.0.0.1:49639      ::ffff:127.0.0.1:54311      ESTABLISHED
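(For what it's worth, the two greps above can be condensed into a direct probe of both configured ports; a sketch assuming bash with /dev/tcp support and coreutils `timeout`. 54310 is the NameNode port from core-site.xml and 54311 the JobTracker port from mapred-site.xml, so "closed" on 54310 alone points at the NameNode.)

```shell
# Probe each configured port directly instead of grepping netstat output.
port_state() {  # usage: port_state PORT -> prints "open" or "closed"
  if timeout 2 bash -c "exec 3<>/dev/tcp/localhost/$1" 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}

for port in 54310 54311; do   # 54310 = NameNode RPC, 54311 = JobTracker RPC
  echo "port $port: $(port_state $port)"
done
```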
3) Starting Hadoop

[mahmood@tiger hadoop-1.0.2]$ stop-all.sh
Warning: $HADOOP_HOME is deprecated.
stopping jobtracker
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: stopping tasktracker
no namenode to stop
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: no datanode to stop
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: stopping secondarynamenode
[mahmood@tiger hadoop-1.0.2]$ start-all.sh
Warning: $HADOOP_HOME is deprecated.
starting namenode, logging to /home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-namenode-tiger.out
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: starting datanode, logging to /home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-datanode-tiger.out
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: starting secondarynamenode, logging to /home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-secondarynamenode-tiger.out
starting jobtracker, logging to /home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-jobtracker-tiger.out
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: starting tasktracker, logging to /home/mahmood/bigdatabench/apache/hadoop-1.0.2/libexec/../logs/hadoop-mahmood-tasktracker-tiger.out
[mahmood@tiger hadoop-1.0.2]$ jps
21712 JobTracker
21882 TaskTracker
21580 SecondaryNameNode
21989 Jps
[mahmood@tiger hadoop-1.0.2]$ hadoop dfsadmin -report
Warning: $HADOOP_HOME is deprecated.
15/04/17 12:45:31 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
15/04/17 12:45:32 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
Regards,
Mahmood