"hadoop fs -rmr /op"

That command always fails. I am trying to run sequential Hadoop jobs.
After the first run, all subsequent runs fail during cleanup (i.e.,
while removing the HDFS output directory created by the previous run).
What can I do to avoid this?
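As a workaround sketch (not from the original post): guard the cleanup so that a missing or already-removed /op directory does not abort the run. The path /op is the example from the command above; `hadoop fs -test -d` exits 0 only when the directory exists, so `-rmr` is attempted only in that case.

```shell
# Hedged sketch: tolerant cleanup between sequential job runs.
# OUT_DIR is the example output path from the question; adjust as needed.
OUT_DIR=/op

if command -v hadoop >/dev/null 2>&1; then
    # Remove the directory only if it exists; otherwise note and continue,
    # so the overall script does not fail on the cleanup step.
    hadoop fs -test -d "$OUT_DIR" && hadoop fs -rmr "$OUT_DIR" \
        || echo "$OUT_DIR not present; nothing to remove"
else
    echo "hadoop not on PATH; skipping cleanup"
fi
```

Alternatively, appending `|| true` to the `-rmr` line ignores the removal's exit status entirely, at the cost of also hiding genuine permission or connectivity errors.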

Here is my Hadoop version:
# hadoop version
Hadoop 0.20.0
Subversion https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.20
-r 763504
Compiled by ndaley on Thu Apr  9 05:18:40 UTC 2009

Any help is greatly appreciated.

-Prasen
