I think there may be a network configuration error, so perhaps a hostname cannot be resolved.
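One quick way to check is to verify that the name this machine reports for itself actually resolves. This is only a generic sketch; substitute the host names from your own conf/masters, conf/slaves, and log files (your logs suggest something like fbsd.lylescott.ws):

```shell
# Check that the host name this machine advertises actually resolves.
host=$(hostname)
echo "This machine calls itself: $host"

# getent consults the same resolver order (hosts file, then DNS)
# that the JVM normally uses.
if getent hosts "$host" >/dev/null; then
    echo "$host resolves"
else
    echo "$host does NOT resolve - consider adding an entry to /etc/hosts"
fi
```

If the name does not resolve, the reducers cannot fetch map output from the task tracker, which would match the job hanging at `map 100% reduce 0%`.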
2008/9/22 Lyle Scott, III <[EMAIL PROTECTED]>

> Hello list,
>
> I can't seem to fix one error I get when I run the wordcount example.
> Everything seems to map just fine, but I can't seem to get reduce to work.
>
> [EMAIL PROTECTED] ~/hadoop]$ bin/hadoop jar hadoop-0.18.1-examples.jar wordcount wc_exampe wc_exampe-out
> 08/09/21 18:45:22 INFO mapred.FileInputFormat: Total input paths to process : 3
> 08/09/21 18:45:22 INFO mapred.FileInputFormat: Total input paths to process : 3
> 08/09/21 18:45:22 INFO mapred.JobClient: Running job: job_200809211840_0001
> 08/09/21 18:45:23 INFO mapred.JobClient: map 0% reduce 0%
> 08/09/21 18:45:41 INFO mapred.JobClient: map 29% reduce 0%
> 08/09/21 18:45:43 INFO mapred.JobClient: map 39% reduce 0%
> 08/09/21 18:45:44 INFO mapred.JobClient: map 66% reduce 0%
> 08/09/21 18:45:46 INFO mapred.JobClient: map 100% reduce 0%
>
> ... and then it just stops there.
>
> I am using FreeBSD 7 with JDK 1.6 and posted all the logs I could find with error info. Any ideas?
>
> http://localhost:50070/logs/hadoop-hadoop-secondarynamenode-fbsd.lylescott.ws.out
> 2008-09-21 18:45:21,831 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory /tmp/hadoop/hadoop-hadoop/dfs/namesecondary is added.
> 2008-09-21 18:46:36,886 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint:
> 2008-09-21 18:46:36,887 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.net.ConnectException: Operation timed out
>         at java.net.PlainSocketImpl.socketConnect(Native Method)
>         at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
>         at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:193)
>         at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
>         at java.net.Socket.connect(Socket.java:519)
>         at java.net.Socket.connect(Socket.java:469)
>         at sun.net.NetworkClient.doConnect(NetworkClient.java:157)
>         at sun.net.www.http.HttpClient.openServer(HttpClient.java:394)
>         at sun.net.www.http.HttpClient.openServer(HttpClient.java:529)
>         at sun.net.www.http.HttpClient.<init>(HttpClient.java:233)
>         at sun.net.www.http.HttpClient.New(HttpClient.java:306)
>         at sun.net.www.http.HttpClient.New(HttpClient.java:323)
>         at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:788)
>         at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:729)
>         at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:654)
>         at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:977)
>         at org.apache.hadoop.dfs.TransferFsImage.getFileClient(TransferFsImage.java:150)
>         at org.apache.hadoop.dfs.SecondaryNameNode.downloadCheckpointFiles(SecondaryNameNode.java:247)
>         at org.apache.hadoop.dfs.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:304)
>         at org.apache.hadoop.dfs.SecondaryNameNode.run(SecondaryNameNode.java:216)
>         at java.lang.Thread.run(Thread.java:619)
>
> http://localhost:50070/logs/hadoop-hadoop-tasktracker-fbsd.lylescott.ws.log
> 2008-09-21 18:45:43,172 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_m_000000_0 1.0% hdfs://localhost:54310/user/hadoop/gutenberg/4300.txt:0+1573044
> 2008-09-21 18:45:43,187 INFO org.apache.hadoop.mapred.TaskTracker: Task attempt_200809211840_0001_m_000000_0 is done.
> 2008-09-21 18:45:45,585 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_m_000002_0 1.0% hdfs://localhost:54310/user/hadoop/gutenberg/20417.txt:0+674762
> 2008-09-21 18:45:45,590 INFO org.apache.hadoop.mapred.TaskTracker: Task attempt_200809211840_0001_m_000002_0 is done.
> 2008-09-21 18:45:46,150 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:46:22,213 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:46:58,313 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:47:34,371 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:48:10,446 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:48:13,456 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:48:49,535 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:48:55,543 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:49:01,561 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
> 2008-09-21 18:49:04,570 INFO org.apache.hadoop.mapred.TaskTracker: attempt_200809211840_0001_r_000000_0 0.0% reduce > copy >
>
> <........ keeps on scrolling! ............>

--
Sorry for my English! Please help me correct my English expression and syntax errors.
明
