As of the current version (2.6.0), hadoop fs is deprecated; use hdfs dfs instead.

Ref:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CommandsManual.html
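For example, the first put command quoted below would read the same way under the newer front end (same paths, only the command prefix changes):

    hdfs dfs -put localfile /user/hadoop/hadoopfile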

On Dec 29, 2014, at 05:26, Abhishek Singh wrote:

Hello Anil,

There are two ways I'm aware of:

1) Use the put command.

Usage: hadoop fs -put <localsrc> ... <dst>

Copies a single src, or multiple srcs, from the local file system to the destination file system. It also reads input from stdin and writes to the destination file system.

    hadoop fs -put localfile /user/hadoop/hadoopfile
    hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
    hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
    hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
    Reads the input from stdin.

Exit Code:

Returns 0 on success and -1 on error.
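Note that because put accepts multiple sources, your local shell's wildcard expansion already covers the case you describe: the shell expands the glob into a list of files before hadoop ever runs. A minimal sketch, assuming the files live under /data/logs locally and the HDFS target directory already exists:

    hadoop fs -put /data/logs/*.log /user/hadoop/logs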

2) Create a shell script for your custom needs.

To give you a rough idea, here's a link to a Stack Overflow question similar to what you're asking for (a small sketch follows the link):

http://stackoverflow.com/questions/12790166/shell-script-to-move-files-into-a-hadoop-cluster
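In script form, the same idea might look like this. It's only a minimal sketch; the source directory, file pattern, and HDFS target below are assumptions to adjust for your own layout:

    #!/bin/bash
    # Copy every matching local file into HDFS, one put per file,
    # so a single failure doesn't abort the whole batch.
    SRC_DIR=/data/incoming          # assumed local source directory
    DEST_DIR=/user/hadoop/incoming  # assumed HDFS target directory

    hadoop fs -mkdir -p "$DEST_DIR"
    for f in "$SRC_DIR"/*.csv; do
        [ -e "$f" ] || continue     # skip if the glob matched nothing
        hadoop fs -put "$f" "$DEST_DIR" || echo "failed: $f" >&2
    done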

Please reach out for further discussion!

Thanks!

Regards,

Abhishek Singh

On Dec 28, 2014 3:52 AM, "Anil Jagtap" <[email protected]> wrote:

    Dear All,

    Just wanted to know if there is a way to copy multiple files using
    hadoop fs -put.

    Instead of specifying individual names, I'd provide wildcards and
    the matching files should get copied.

    Thank You.

    Rgds, Anil


--
Best Regards,
Cao Yi, 曹铱
Tel: 189-8052-8753

北京普菲特广告有限公司(成都)
