I'm using the Hadoop FS shell commands to copy files from my local machine into
the Hadoop DFS. I'd like a way to force the write even if a file of the same
name already exists at the destination. Ideally I'd like a "-force" switch or
some such; e.g.,
    hadoop dfs -copyFromLocal -force adirectory s3n://wholeinthebucket/
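
In the meantime, the closest workaround I've found with the existing commands
is to do it in two steps: delete any existing copy, then upload again. A
rough sketch (the destination path below is just my bucket layout from the
example above):

    # two-step workaround: remove any existing copy, then re-upload
    # (-rmr prints an error if the path doesn't exist yet, which is harmless here)
    hadoop dfs -rmr s3n://wholeinthebucket/adirectory
    hadoop dfs -copyFromLocal adirectory s3n://wholeinthebucket/

This is clumsy and not atomic, though, which is why a single "-force" switch
would be nicer.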

Is there a way to do this, or does anyone know if it is planned for a future
Hadoop release?

Thanks
John SD
