It shouldn't create any issues; just make sure you have all the dependent files available on HDFS before you rerun the job.
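
Roughly, the backup / format / restore steps could look like the sketch below. The paths (/user/jyoti/giraph-input, /tmp/hdfs-backup) are only placeholders; adjust them for your own setup:

# Copy the Giraph input files out of HDFS before formatting
hadoop fs -copyToLocal /user/jyoti/giraph-input /tmp/hdfs-backup/

# Stop the cluster, then format the namenode (this wipes the HDFS metadata)
stop-all.sh
hadoop namenode -format

# Start the cluster again and put the files back before resubmitting the Giraph job
start-all.sh
hadoop fs -mkdir /user/jyoti/giraph-input
hadoop fs -copyFromLocal /tmp/hdfs-backup/giraph-input/* /user/jyoti/giraph-input/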
Thanks
Jitendra

On Fri, Jan 31, 2014 at 4:52 PM, Jyoti Yadav <[email protected]> wrote:

> Thanks Jitendra..
> If I format, will it create any side effect on Giraph job execution?? I
> think it won't create any problem, but I am getting a little bit scared.
>
> Thanks
>
>
> On Fri, Jan 31, 2014 at 4:22 PM, Jitendra Yadav <
> [email protected]> wrote:
>
>> Hi Jyoti,
>>
>> That's right, you will lose all the HDFS data, therefore you need to take
>> a backup of your critical data from HDFS to some other place. If you are
>> using logical volumes, then it would be better to add more space to the
>> particular volume/mount point.
>>
>> Regards
>> Jitendra
>>
>>
>> On Fri, Jan 31, 2014 at 4:10 PM, Jyoti Yadav <[email protected]> wrote:
>>
>>> Hi folks..
>>>
>>> I have some doubts while thinking of formatting the namenode..
>>> Actually, I am doing my project in Apache Giraph, which makes use of the
>>> Hadoop environment to run the job..
>>>
>>> While running various jobs, I noticed that the
>>> /app/hadoop/tmp/dfs/name/data
>>> directory is almost full..
>>>
>>> Should I format my namenode?? Will it create any problem??
>>> I know if I format, I will lose all my data residing in HDFS.
>>> Before formatting it, I will take a backup of all the input files used to
>>> run the Giraph job..
>>>
>>> Seeking your suggestions..
>>> Thanks
>>>
>>
>>
