Hello!

A quick question that hopefully someone out there can answer.

Does hadoop dfs support putting multiple files at once?

The documentation says -put only works on one file. What's the best way to
import multiple files spread across multiple directories (e.g. dir1/file1,
dir1/file2, dir2/file1, dir2/file2, and so on)?

The end goal would be to do something like:

    bin/hadoop dfs -put /dir*/file* /myfiles

so that, as a follow-up, bin/hadoop dfs -lsr /myfiles
would list:

/myfiles/dir1/file1
/myfiles/dir1/file2
/myfiles/dir2/file1
/myfiles/dir2/file2
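
If -put really does take only a single source at a time, the fallback I can picture is a small shell loop along these lines (just a sketch, assuming the glob expands locally and that dfs -mkdir will create the target directories; I haven't checked the exact -mkdir behaviour on my version):

    # Copy each local dir*/file* into HDFS under /myfiles,
    # preserving the dir/file layout.
    for f in dir*/file*; do
        d=$(dirname "$f")
        # create the matching directory in HDFS (may already exist)
        bin/hadoop dfs -mkdir "/myfiles/$d"
        # upload this one file into that directory
        bin/hadoop dfs -put "$f" "/myfiles/$d/"
    done

But a loop like that feels clunky, so I'm hoping there's a built-in way.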

Thanks again for any input!!!

- chris
