John, I couldn't find a way from the console either.
You may already know this and prefer not to use it, but the API solves the problem:
FileSystem.copyFromLocalFile(boolean delSrc, boolean overwrite, Path src, Path dst)
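
A minimal sketch of using that call to force an overwrite (the class name and the source/destination arguments here are hypothetical; the FileSystem, Path, and Configuration classes are the standard org.apache.hadoop.fs / org.apache.hadoop.conf ones):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ForceCopy {
    public static void main(String[] args) throws Exception {
        // Picks up the cluster settings from the config files on the classpath
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path src = new Path(args[0]); // local source, e.g. a directory
        Path dst = new Path(args[1]); // DFS (or s3n://...) destination

        // delSrc = false keeps the local copy;
        // overwrite = true replaces an existing file at the destination
        fs.copyFromLocalFile(false, true, src, dst);
    }
}
```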

If you have to use the console, a longer solution is to build a jar
around that call and invoke it the same way the "hadoop" script in the
bin directory invokes the FileSystem class.
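
Once such a jar is built, it can be launched through the standard "hadoop jar" command (the jar and class names below are hypothetical, and the bucket URL is just the one from your example):

```shell
# Run the overwrite-forcing copy class packaged in forcecopy.jar
hadoop jar forcecopy.jar ForceCopy adirectory s3n://wholeinthebucket/
```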

I think the FileSystem API also needs some improvement in this area.
I wonder whether the core developers are considering it.

Hope this helps,
Rasit

2009/2/4 S D <sd.codewarr...@gmail.com>:
> I'm using the Hadoop FS commands to move files from my local machine into
> the Hadoop dfs. I'd like a way to force a write to the dfs even if a file of
> the same name exists. Ideally I'd like to use a "-force" switch or some
> such; e.g.,
>    hadoop dfs -copyFromLocal -force adirectory s3n://wholeinthebucket/
>
> Is there a way to do this or does anyone know if this is in the future
> Hadoop plans?
>
> Thanks
> John SD
>



-- 
M. Raşit ÖZDAŞ
