I have found it very helpful to run hadoop fsck / in these situations. If the problematic files are not important to you, you can exit safe mode explicitly and just delete them; that will give you a clean file system. Examples of files that might be OK to delete include left-over output from aborted map-reduce jobs, or files that you can simply re-import into HDFS from other storage.
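For reference, the commands involved look roughly like this (a sketch only; exact flags can vary by Hadoop version, /path/to/leftover is a placeholder, and older releases spell the shell as "hadoop dfs" rather than "hadoop fs", so check the usage output of bin/hadoop on your install):

  # report the health of HDFS and list missing or corrupt blocks
  hadoop fsck /

  # ask whether the namenode is currently in safe mode
  hadoop dfsadmin -safemode get

  # force the namenode to leave safe mode
  hadoop dfsadmin -safemode leave

  # then remove the files you decided are expendable
  hadoop dfs -rm /path/to/leftover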
On 10/16/07 11:17 PM, "cpreethi" <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> It has been in safe mode the whole time, and I am not able to execute
> jobs either. I get an error saying it cannot delete a file because the
> namenode is in safe mode.
>
> Thanks,
> Preethi.C
>
>
> Ned Rockson wrote:
>>
>> I think it's in safe mode for a short time after startup to check the
>> integrity of the file system.
>>
>> On 10/16/07, Preethi Chockalingam <[EMAIL PROTECTED]> wrote:
>>> Hi,
>>>
>>> I have been facing some problems in Hadoop. I find the namenode to be
>>> in safe mode. May I know why the namenode is in safe mode, and what
>>> happens while it is in safe mode?
>>>
>>> Thanks in advance,
>>> Preethi.C
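To expand on Ned's explanation quoted above: at startup the namenode waits in safe mode until the datanodes have reported a minimum fraction of the blocks, and then leaves it on its own. A minimal sketch of the hadoop-site.xml setting that controls this (property name and default as I recall them from hadoop-default.xml; verify against your version):

  <property>
    <name>dfs.safemode.threshold.pct</name>
    <value>0.999</value>
    <description>Fraction of blocks that must satisfy their minimal
    replication requirement before the namenode leaves safe mode
    automatically.</description>
  </property>

If the namenode never reaches that threshold, for example because the datanodes holding the only copies of some blocks are gone, it stays in safe mode indefinitely. That is the situation where fsck plus a manual -safemode leave lets you diagnose the damage and get back to a working file system.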