hadoop dfs -put will take a directory. If it doesn't recurse the way you
want, you can probably bang out a bash script that handles it using
find(1) and xargs(1).
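For instance, here's one possible sketch (untested) that walks dir*/file* and issues one -put per file, preserving the directory component so you end up with /myfiles/dir1/file1 etc. The leading echo makes it a dry run; drop it to actually copy, and it assumes the hadoop script is on your PATH:

```shell
# Dry-run sketch: list every dir*/file* under the current directory and
# print the hadoop command that would upload it, keeping the relative
# path under /myfiles. Remove "echo" to run the uploads for real.
find . -path './dir*/file*' -type f | while read -r f; do
  rel="${f#./}"                       # strip leading ./  -> dir1/file1
  echo hadoop dfs -put "$f" "/myfiles/$rel"
done
```

(A plain `find ... | xargs hadoop dfs -put` variant is simpler but would dump everything flat into one target directory, which isn't what you want here.)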
-- Aaron
Chris Fellows wrote:
Hello!
Quick simple question, hopefully someone out there could answer.
Does hadoop dfs support putting multiple files at once?
The documentation says -put works on only one file. What's the best way to
import multiple files across multiple directories (i.e. dir1/file1, dir1/file2,
dir2/file1, dir2/file2, etc.)?
End goal would be to do something like:
bin/hadoop dfs -put /dir*/file* /myfiles
And a follow-up: bin/hadoop dfs -lsr /myfiles
would list:
/myfiles/dir1/file1
/myfiles/dir1/file2
/myfiles/dir2/file1
/myfiles/dir2/file2
Thanks again for any input!!!
- chris