A few things may help:

   - delete the individual files under /op (sketch below)
   - open another terminal

I don't know why it helps, but it does, and the error goes away.
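
For what it's worth, here is an untested sketch of the first suggestion,
assuming the leftover files sit directly under /op (the path from the
message below):

    # List everything under /op and remove each entry individually.
    # The grep drops the "Found N items" header line that -ls prints.
    for f in $(hadoop fs -ls /op | awk '{print $NF}' | grep '^/op/'); do
      hadoop fs -rmr "$f"    # -rmr also handles subdirectories
    done
    # With the contents gone, the directory itself usually removes cleanly:
    hadoop fs -rmr /op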

On Mon, Jan 18, 2010 at 10:45 PM, prasenjit mukherjee
<[email protected]> wrote:

> "hadoop fs -rmr /op"
>
> That command always fails. I am trying to run sequential Hadoop jobs.
> After the first run, all subsequent runs fail while cleaning up (i.e.,
> removing the Hadoop directory created by the previous run). What can I
> do to avoid this?
>
> Here is my Hadoop version:
> # hadoop version
> Hadoop 0.20.0
> Subversion https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.20 -r 763504
> Compiled by ndaley on Thu Apr  9 05:18:40 UTC 2009
>
> Any help is greatly appreciated.
>
> -Prasen
>
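
Another option, just a sketch and not one of the suggestions above: wrap
the cleanup in a retry loop between runs. The retry count and sleep below
are arbitrary.

    # Hypothetical wrapper around the cleanup step between jobs:
    for i in 1 2 3; do
      hadoop fs -rmr /op && break   # stop as soon as the delete succeeds
      sleep 5                       # wait briefly before retrying
    done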
