$ sudo rm -rf /var/lib/hadoop-0.20/cache/hdfs/dfs/data

Then start the DN again as usual, and all should be OK.
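For background: the DN refuses to start because the namespaceID recorded in
its data directory no longer matches the namenode's, which typically happens
after re-running the namenode format command. Wiping the data dir lets the DN
re-register against the current namespace, at the cost of any blocks it was
storing (harmless on a fresh single-node setup, but check first on a real
cluster). If you want to confirm the mismatch before deleting anything, a
rough sketch, assuming the CDH3 default paths from your logs and the stock
hadoop-0.20 init scripts (the name-dir path and service name here are my
assumptions; adjust to your install):

$ # Compare the two namespaceIDs; in your case they differ
$ # (namenode = 470535428, datanode = 1304806298).
$ sudo grep namespaceID /var/lib/hadoop-0.20/cache/hdfs/dfs/name/current/VERSION
$ sudo grep namespaceID /var/lib/hadoop-0.20/cache/hdfs/dfs/data/current/VERSION

$ # Stop the DN, wipe its data directory, then start it again.
$ sudo service hadoop-0.20-datanode stop
$ sudo rm -rf /var/lib/hadoop-0.20/cache/hdfs/dfs/data
$ sudo service hadoop-0.20-datanode start

If the DN holds data you care about, the non-destructive alternative is to
edit the namespaceID in the datanode's current/VERSION file to match the
namenode's value instead of deleting the directory.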
On Thu, Feb 2, 2012 at 5:36 AM, Vijayakumar Ramdoss <[email protected]> wrote:
> Hi All,
>
> When I try to start the datanode, namenode and secondary namenode, they
> throw org.apache.hadoop.security.UserGroupInformation:
> PriviledgedActionException as:hdfs error messages.
>
> I have attached the log files here.
>
> hadoop-hdfs-namenode-ubuntu.log
> -------------------------------
> 2012-02-01 18:49:31,622 ERROR org.apache.hadoop.security.UserGroupInformation:
> PriviledgedActionException as:hdfs (auth:SIMPLE)
> cause:org.apache.hadoop.hdfs.server.namenode.SafeModeException: Checkpoint
> not created. Name node is in safe mode.
> The number of live datanodes 0 needs an additional 1 live datanodes to
> reach the minimum number 1. Safe mode will be turned off automatically.
> 2012-02-01 18:49:31,622 INFO org.apache.hadoop.ipc.Server: IPC Server
> handler 8 on 8020, call rollEditLog() from 127.0.0.1:47594: error:
> org.apache.hadoop.hdfs.server.namenode.SafeModeException: Checkpoint not
> created. Name node is in safe mode.
> The number of live datanodes 0 needs an additional 1 live datanodes to
> reach the minimum number 1. Safe mode will be turned off automatically.
> org.apache.hadoop.hdfs.server.namenode.SafeModeException: Checkpoint not
> created. Name node is in safe mode.
> The number of live datanodes 0 needs an additional 1 live datanodes to
> reach the minimum number 1. Safe mode will be turned off automatically.
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.rollEditLog(FSNamesystem.java:5095)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.rollEditLog(NameNode.java:877)
>     at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1434)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1430)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1428)
>
> hadoop-root-datanode-ubuntu.log
> -------------------------------
> 2012-02-01 18:49:22,635 INFO org.apache.hadoop.security.UserGroupInformation:
> JAAS Configuration already set up for Hadoop, not re-installing.
> 2012-02-01 18:49:22,822 INFO org.apache.hadoop.security.UserGroupInformation:
> JAAS Configuration already set up for Hadoop, not re-installing.
> 2012-02-01 18:49:23,129 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
> java.io.IOException: Incompatible namespaceIDs in
> /var/lib/hadoop-0.20/cache/hdfs/dfs/data: namenode namespaceID = 470535428;
> datanode namespaceID = 1304806298
>     at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:238)
>     at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:153)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:410)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:305)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1606)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1546)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1564)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1690)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1707)
>
> 2012-02-01 18:49:23,131 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at ubuntu/127.0.1.1
>
> Thanks and Regards
> Vijay
>
> [email protected]

--
Harsh J
Customer Ops. Engineer
Cloudera | http://tiny.cloudera.com/about
