Can you try with dfs, without the quotes? If you're using Pig to run the jobs, you can use rmf within your script (again w/o quotes, e.g. rmf /op) to force the removal and avoid an error if the file/dir is not present. Or, if you're doing this inside a Hadoop job, you can use the FileSystem/FileStatus API to delete the directories. HTH.

Cheers,
/R
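Something along these lines for the FileSystem approach, as a rough untested sketch against the 0.20 API; the /op path and the class name are just placeholders taken from your command:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CleanOutput {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();  // picks up the cluster config
            FileSystem fs = FileSystem.get(conf);
            Path out = new Path("/op");                // output dir from the earlier run
            if (fs.exists(out)) {
                fs.delete(out, true);                  // true = recursive, like -rmr
            }
        }
    }

You'd run the same exists/delete check at the top of each job in the sequence, before submitting it.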
On 1/19/10 10:15 AM, "prasenjit mukherjee" <[email protected]> wrote:

"hadoop fs -rmr /op"

That command always fails. I am trying to run sequential Hadoop jobs. After the first run, all subsequent runs fail while cleaning up (i.e. removing the Hadoop dir created by the previous run). What can I do to avoid this?

Here is my Hadoop version:

# hadoop version
Hadoop 0.20.0
Subversion https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.20 -r 763504
Compiled by ndaley on Thu Apr 9 05:18:40 UTC 2009

Any help is greatly appreciated.
-Prasen
