Holger,

This is a bug in SecondaryNameNode. I just verified it.
The problem is that SecondaryNameNode incorrectly handles the "hdfs://"
prefix in fs.default.name.
I also verified that HADOOP-2185 fixes the problem.
Until it is committed, you can work around it by modifying hadoop-site.xml.
Please use
<property>
  <name>fs.default.name</name>
  <value>deri-srvgal32.nuigalway.ie:59310</value>
</property>
instead of what you have now.
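To see why dropping the scheme helps: this is not Hadoop's actual code, just a minimal sketch (class and method names are hypothetical) of how a naive host:port split of fs.default.name can misread a "hdfs://" URI, producing exactly the UnknownHostException: hdfs seen in the log below.

```java
// Hypothetical sketch: a naive parser that splits an address at the
// first ':' to extract the host name. With a plain "host:port" value
// this works; with "hdfs://host:port" the scheme becomes the "host".
public class UriParseSketch {

    // Returns everything before the first ':' as the host name.
    public static String naiveHost(String address) {
        return address.substring(0, address.indexOf(':'));
    }

    public static void main(String[] args) {
        // Scheme-prefixed value: the "host" is "hdfs", which then fails
        // DNS resolution (java.net.UnknownHostException: hdfs).
        System.out.println(naiveHost("hdfs://deri-srvgal32.nuigalway.ie:59310"));
        // Plain host:port value: parsed correctly.
        System.out.println(naiveHost("deri-srvgal32.nuigalway.ie:59310"));
    }
}
```

This is why the scheme-less value above works around the bug until the HADOOP-2185 fix (which parses the value as a proper URI) is committed.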

This problem should be fixed in Hadoop 0.15 and probably in 0.14, so it would
be good to file a JIRA about it.

Thanks,

--Konstantin

Holger Stenzhorn wrote:
Hello,

I am using the latest Hadoop version from SVN trunk. When starting up Hadoop DFS, everything seems to work fine at first. But looking at the SecondaryNameNode's log file, I see that after five minutes an exception is thrown (see the attached log excerpt).
None of the other log files shows any exceptions whatsoever.

To avoid clashes with another user running Hadoop on the same cluster, I have upped all ports by 5,000 (see the attached hadoop-site.xml).
Is there anything missing in my configuration?

Cheers,
Holger

log file:
-------

2007-11-28 12:32:56,720 INFO org.apache.hadoop.dfs.NameNode.Secondary: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting SecondaryNameNode
STARTUP_MSG:   host = deri-srvgal32/140.203.154.192
STARTUP_MSG:   args = []
************************************************************/
2007-11-28 12:32:56,872 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=SecondaryNameNode, sessionId=null
2007-11-28 12:32:57,062 INFO org.mortbay.util.Credential: Checking Resource aliases
2007-11-28 12:32:57,193 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4
2007-11-28 12:32:57,194 INFO org.mortbay.util.Container: Started HttpContext[/static,/static]
2007-11-28 12:32:57,194 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs]
2007-11-28 12:32:57,846 INFO org.mortbay.util.Container: Started [EMAIL PROTECTED]
2007-11-28 12:32:58,024 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/]
2007-11-28 12:32:58,058 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:55090
2007-11-28 12:32:58,058 INFO org.mortbay.util.Container: Started [EMAIL PROTECTED]
2007-11-28 12:32:58,059 INFO org.apache.hadoop.dfs.NameNode.Secondary: Secondary Web-server up at: 55090
2007-11-28 12:32:58,059 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint Directory:/sindice/data/holger/tmp/dfs/namesecondary
2007-11-28 12:32:58,059 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint Period :3600 secs (60 min)
2007-11-28 12:32:58,059 WARN org.apache.hadoop.dfs.NameNode.Secondary: Log Size Trigger :67108864 bytes (65536 KB)
2007-11-28 12:37:58,151 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint:
2007-11-28 12:37:58,152 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.net.UnknownHostException: hdfs
   at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:177)
   at java.net.Socket.connect(Socket.java:519)
   at java.net.Socket.connect(Socket.java:469)
   at sun.net.NetworkClient.doConnect(NetworkClient.java:157)
   at sun.net.www.http.HttpClient.openServer(HttpClient.java:394)
   at sun.net.www.http.HttpClient.openServer(HttpClient.java:529)
   at sun.net.www.http.HttpClient.<init>(HttpClient.java:233)
   at sun.net.www.http.HttpClient.New(HttpClient.java:306)
   at sun.net.www.http.HttpClient.New(HttpClient.java:323)
   at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:788)
   at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:729)
   at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:654)
   at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:977)
   at org.apache.hadoop.dfs.TransferFsImage.getFileClient(TransferFsImage.java:142)
   at org.apache.hadoop.dfs.TransferFsImage.getFileClient(TransferFsImage.java:181)
   at org.apache.hadoop.dfs.SecondaryNameNode.getFSImage(SecondaryNameNode.java:209)
   at org.apache.hadoop.dfs.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:275)
   at org.apache.hadoop.dfs.SecondaryNameNode.run(SecondaryNameNode.java:192)
   at java.lang.Thread.run(Thread.java:619)


hadoop-site.xml:
-----------------

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
 <property>
   <name>mapred.child.java.opts</name>
   <value>-Xmx2000m</value>
 </property>
 <property>
   <name>hadoop.tmp.dir</name>
   <value>/sindice/data/holger/tmp</value>
 </property>
 <property>
   <name>fs.default.name</name>
   <value>hdfs://deri-srvgal32.nuigalway.ie:59310</value>
 </property>
 <property>
   <name>mapred.job.tracker</name>
   <value>deri-srvgal32.nuigalway.ie:59311</value>
 </property>
 <property>
   <name>dfs.replication</name>
   <value>3</value>
 </property>
 <property>
   <name>dfs.secondary.info.port</name>
   <value>55090</value>
 </property>
 <property>
   <name>dfs.datanode.port</name>
   <value>55010</value>
 </property>
 <property>
   <name>dfs.info.port</name>
   <value>55070</value>
 </property>
 <property>
   <name>mapred.job.tracker.info.port</name>
   <value>55030</value>
 </property>
 <property>
   <name>tasktracker.http.port</name>
   <value>55060</value>
 </property>
</configuration>
