In case anyone wants to know: there was stale data left in the /tmp directory. I stopped all nodes, formatted HDFS, and then restarted the nodes. That seems to have solved the problem.
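For anyone who hits this later and wants to keep their HDFS data instead of reformatting: the clusterID lives in <storage dir>/current/VERSION on both daemons, and the DataNode refuses to register when its ID differs from the NameNode's. A minimal sketch of copying the NameNode's ID over by hand (the helper name fix_cluster_id is mine, not part of Hadoop, and the paths in the comments are the /tmp defaults from the guide; the demo runs on throwaway copies):

```shell
# Rewrite a DataNode VERSION file so its clusterID matches the NameNode's.
fix_cluster_id() {
    nn_version=$1   # e.g. /tmp/hadoop-$USER/dfs/name/current/VERSION
    dn_version=$2   # e.g. /tmp/hadoop-$USER/dfs/data/current/VERSION
    # Extract the NameNode's clusterID line, then substitute it in place.
    nn_id=$(sed -n 's/^clusterID=//p' "$nn_version")
    sed -i "s/^clusterID=.*/clusterID=$nn_id/" "$dn_version"
    echo "DataNode clusterID set to $nn_id"
}

# Demo on throwaway copies of the two VERSION files:
demo=$(mktemp -d)
printf 'namespaceID=1\nclusterID=CID-4146405c\n' > "$demo/nn_VERSION"
printf 'storageID=DS-1\nclusterID=CID-9280fd26\n' > "$demo/dn_VERSION"
fix_cluster_id "$demo/nn_VERSION" "$demo/dn_VERSION"
grep clusterID "$demo/dn_VERSION"   # prints clusterID=CID-4146405c
```

Stop HDFS (sbin/stop-dfs.sh) before touching the real files, and restart afterwards. Wiping the DataNode directory instead also works, as does reformatting, but both throw the blocks away.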
2014-10-02 20:58 GMT+02:00 Roger Maillist <[email protected]>:
> Hi
>
> For learning purposes, I am trying to set up my own Hadoop/HDFS system at
> home. I am running openSUSE 13 and Hadoop 2.5.1.
>
> I followed the explanations in the "Single Node Setup" guide:
>
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
>
> My problem is that the data node won't start:
>
> --> "Incompatible clusterIDs"
>
> How can I fix this?
>
> Thanks
> Roger
>
> 2014-10-02 20:48:31,022 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000. Exiting.
> java.io.IOException: Incompatible clusterIDs in /tmp/hadoop-roger/dfs/data: namenode clusterID = CID-4146405c-72b8-404c-944d-58d5453ae939; datanode clusterID = CID-9280fd26-bee4-430c-8ea5-732a570047db
>         at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:477)
>         at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:226)
>         at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:254)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:975)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:946)
>         at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:278)
>         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:220)
>         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:812)
>         at java.lang.Thread.run(Thread.java:745)
