Here is some info. I grepped for the port numbers instead of LISTEN. Please 
note that I am using Hadoop 1.2.1.

$ netstat -an | grep 54310
$ netstat -an | grep 54311
tcp        0      0 ::ffff:127.0.0.1:54311      :::*                        LISTEN
tcp        0      0 ::ffff:127.0.0.1:57479      ::ffff:127.0.0.1:54311      ESTABLISHED
tcp        0      0 ::ffff:127.0.0.1:54311      ::ffff:127.0.0.1:57479      ESTABLISHED
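
The output above shows port 54311 (the JobTracker) listening, but the first grep returned nothing for 54310, which matches the NameNode not being up. As a quick sanity check independent of netstat, here is a minimal sketch (Python; the host/port values are taken from the configs quoted in this thread) that probes whether anything accepts TCP connections on those ports:

```python
import socket

def port_is_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 54310 = fs.default.name (NameNode RPC), 54311 = mapred.job.tracker
for port in (54310, 54311):
    print(port, "open" if port_is_open("127.0.0.1", port) else "closed")
```

If 54310 reports "closed" while the java processes are running, the NameNode process either died after startup or is bound to a different address than the client expects.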


$ hadoop dfsadmin -report
14/03/27 17:35:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/03/27 17:35:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)


$ hadoop fsck /
14/03/27 17:36:10 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:java.net.ConnectException: Connection refused
Exception in thread "main" java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:198)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:579)
    at java.net.Socket.connect(Socket.java:528)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:996)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:932)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:850)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1300)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:142)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:109)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:109)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:183)


 
Regards,
Mahmood



On Thursday, March 27, 2014 5:11 PM, John Lilley <[email protected]> 
wrote:
 
Does “netstat -an | grep LISTEN” show these ports being listened on?
 
Can you stat hdfs from the command line e.g.:
 
hdfs dfsadmin -report
hdfs fsck /
hdfs dfs -ls /
 
Also, check out /var/log/hadoop or /var/log/hdfs for more details.
 
john
 
From: Mahmood Naderan [mailto:[email protected]] 
Sent: Thursday, March 27, 2014 5:04 AM
To: [email protected]
Subject: ipc.Client: Retrying connect to server
 
Hi,
I don't know what mistake I made, but now I get this error:


 
  INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
  INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)




I saw the wiki page for that message, and every other resource states that the 
namenode has not been started yet. However, I have tried "stop-all.sh && 
start-all.sh" multiple times, and I do see the Hadoop java processes running.
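
Seeing the java processes does not rule out the NameNode dying right after startup; its log usually records why. Below is a minimal sketch (Python) of scanning a log for failure lines -- the log path in the comment is an assumption (Hadoop 1.x typically writes hadoop-<user>-namenode-<host>.log under $HADOOP_HOME/logs; adjust for your install):

```python
def find_failures(log_text):
    """Return log lines that look like failures (ERROR/FATAL/Exception)."""
    markers = ("ERROR", "FATAL", "Exception")
    return [line for line in log_text.splitlines()
            if any(m in line for m in markers)]

# Hypothetical usage -- the path below is an assumption:
# with open("/usr/local/hadoop/logs/hadoop-hadoop-namenode-localhost.log") as f:
#     for line in find_failures(f.read()):
#         print(line)
```

A common culprit after a misstep is an unformatted or corrupted dfs.name.dir, which shows up as a FATAL line in exactly this log.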
 
More info:
 
core-site.xml
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
</property>



mapred-site.xml
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:54311</value>
</property>
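
The address that ipc.Client keeps retrying comes straight from fs.default.name, so it is worth confirming that the file the running cluster actually reads contains the value quoted above. A minimal sketch (Python, stdlib only; the inline XML mirrors the core-site.xml fragment in this mail) of extracting a property value:

```python
import xml.etree.ElementTree as ET

def get_property(conf_xml, name):
    """Return the <value> for the given <name> in a Hadoop *-site.xml string."""
    root = ET.fromstring(conf_xml)
    for prop in root.iter("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None

core_site = """<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
</property>
</configuration>"""

print(get_property(core_site, "fs.default.name"))  # hdfs://localhost:54310
```

In practice you would read $HADOOP_HOME/conf/core-site.xml instead of the inline string; a mismatch between that file and the port the NameNode actually binds produces exactly this retry loop.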


Any more ideas on that?


 
Regards,
Mahmood
