You must remove the existing output directory before running the job. This
check is put in to prevent you from inadvertently destroying or muddling
your existing output data.

You can remove the output directory in advance programmatically with code
similar to:

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
FileSystem fs = FileSystem.get(conf); // use your JobConf here
fs.delete(new Path("/path/to/output/dir"), true); // true = delete recursively
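If you'd rather not attempt the delete when the directory isn't there, a guarded variant works too. This is just a sketch -- "/path/to/output/dir" is a placeholder for your job's actual output path, and conf is your JobConf (or Configuration):

```java
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Guarded delete: only remove the output directory if it already exists,
// so a first run (with no prior output) doesn't depend on delete's return value.
FileSystem fs = FileSystem.get(conf);
Path out = new Path("/path/to/output/dir");
if (fs.exists(out)) {
    fs.delete(out, true); // true = recursive, removes part files inside
}
```

Run this before submitting the job, and the "output directory already exists" check will pass on every rerun.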

See
http://hadoop.apache.org/core/docs/current/api/org/apache/hadoop/fs/FileSystem.html
for more details.

- Aaron


On Mon, Mar 30, 2009 at 9:25 PM, some speed <[email protected]> wrote:

> Hello everyone,
>
> Is it necessary to redirect the output of reduce to a file? When I am trying
> to run the same M-R job more than once, it throws an error that the output
> file already exists. I don't want to use command line args, so I hard coded
> the file name into the program.
>
> So, is there a way I could delete a file on HDFS programmatically?
> Or can I skip setting an output file path and just have my output print to
> the console?
> Or can I just append to an existing file?
>
>
> Any help is appreciated. Thanks.
>
> -Sharath
>