Hello everyone,

At times I get the following error when I restart my cluster desktops (even though I shut down mapred and dfs properly beforehand). The temp folder contains the directory it is looking for, yet I still get this error. The only solution I have found to get rid of this error is to format my DFS entirely, load the data again, and start the whole process over.
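One more piece of context: the path in the error comes from the default hadoop.tmp.dir (/tmp/hadoop-${user.name}), which I have not overridden in conf/hadoop-site.xml. If overriding it is the right direction, I imagine it would be something like the sketch below — the /home/hadoop paths are just placeholders, not my actual setup:

```xml
<!-- conf/hadoop-site.xml — sketch only; paths are placeholders -->
<configuration>
  <!-- base for all temporary/storage dirs; defaults to /tmp/hadoop-${user.name} -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/hadoop-tmp</value>
  </property>
  <!-- where the NameNode keeps the fsimage/edits; defaults under hadoop.tmp.dir -->
  <property>
    <name>dfs.name.dir</name>
    <value>/home/hadoop/dfs/name</value>
  </property>
</configuration>
```

Would pointing these at a location that survives reboots (i.e. not under /tmp) be the right fix, or is something else going on?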
But in doing that I lose my data on HDFS and have to reload it. Does anyone have any clue about this?

Error from the log file:

2009-04-14 19:40:29,963 INFO org.apache.hadoop.dfs.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = Semantic002/192.168.1.133
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.18.3
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.18 -r 736250; compiled by 'ndaley' on Thu Jan 22 23:12:08 UTC 2009
************************************************************/
2009-04-14 19:40:30,958 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000
2009-04-14 19:40:30,996 INFO org.apache.hadoop.dfs.NameNode: Namenode up at: Semantic002/192.168.1.133:9000
2009-04-14 19:40:31,007 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null
2009-04-14 19:40:31,014 INFO org.apache.hadoop.dfs.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext
2009-04-14 19:40:31,160 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=hadoop,hadoop,adm,dialout,fax,cdrom,floppy,tape,audio,dip,plugdev,scanner,fuse,admin
2009-04-14 19:40:31,161 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup
2009-04-14 19:40:31,161 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true
2009-04-14 19:40:31,183 INFO org.apache.hadoop.dfs.FSNamesystemMetrics: Initializing FSNamesystemMeterics using context object:org.apache.hadoop.metrics.spi.NullContext
2009-04-14 19:40:31,184 INFO org.apache.hadoop.fs.FSNamesystem: Registered FSNamesystemStatusMBean
2009-04-14 19:40:31,248 INFO org.apache.hadoop.dfs.Storage: Storage directory /tmp/hadoop-hadoop/dfs/name does not exist.
2009-04-14 19:40:31,251 ERROR org.apache.hadoop.fs.FSNamesystem: FSNamesystem initialization failed.
org.apache.hadoop.dfs.InconsistentFSStateException: Directory /tmp/hadoop-hadoop/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
        at org.apache.hadoop.dfs.FSImage.recoverTransitionRead(FSImage.java:211)
        at org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:80)
        at org.apache.hadoop.dfs.FSNamesystem.initialize(FSNamesystem.java:294)
        at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:273)
        at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:148)
        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:193)
        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:179)
        at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:830)
        at org.apache.hadoop.dfs.NameNode.main(NameNode.java:839)
2009-04-14 19:40:31,261 INFO org.apache.hadoop.ipc.Server: Stopping server on 9000
2009-04-14 19:40:31,262 ERROR org.apache.hadoop.dfs.NameNode: org.apache.hadoop.dfs.InconsistentFSStateException: Directory /tmp/hadoop-hadoop/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
        at org.apache.hadoop.dfs.FSImage.recoverTransitionRead(FSImage.java:211)
        at org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:80)
        at org.apache.hadoop.dfs.FSNamesystem.initialize(FSNamesystem.java:294)
        at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:273)
        at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:148)
        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:193)
        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:179)
        at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:830)
        at org.apache.hadoop.dfs.NameNode.main(NameNode.java:839)
2009-04-14 19:40:31,267 INFO org.apache.hadoop.dfs.NameNode: SHUTDOWN_MSG:
/************************************************************

Thanks,
Pankil