I would suggest hdfs dfs … as in the examples below, run from a remote host:

 

hdfs dfs -mkdir hdfs://rhes564:9000/some_directory

hdfs dfs -put hadoop-hduser-datanode-rhes5.log hdfs://rhes564:9000/some_directory

hduser@rhes5::/home/hduser/hadoop/hadoop-2.6.0/logs> hdfs dfs -ls hdfs://rhes564:9000/some_directory

Found 1 items

-rw-r--r--   2 hduser supergroup    1274532 2015-04-08 19:25 hdfs://rhes564:9000/some_directory/hadoop-hduser-datanode-rhes5.log
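For reference, the hdfs://rhes564:9000 prefix used in the commands above normally corresponds to the fs.defaultFS property on the cluster; when that property is set on the client, the scheme and host can be omitted. A minimal core-site.xml sketch (host name and port taken from the examples above; everything else is an assumption, not the poster's actual configuration):

```xml
<configuration>
  <!-- Default filesystem URI; lets clients write /some_directory
       instead of hdfs://rhes564:9000/some_directory -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://rhes564:9000</value>
  </property>
</configuration>
```

With this in place, hdfs dfs -ls /some_directory is equivalent to the fully qualified form shown above.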

 

HTH

 

Mich Talebzadeh

 

http://talebzadehmich.wordpress.com

 

Publications due shortly:

Creating in-memory Data Grid for Trading Systems with Oracle TimesTen and 
Coherence Cache

 

NOTE: The information in this email is proprietary and confidential. This 
message is for the designated recipient only; if you are not the intended 
recipient, you should destroy it immediately. Any information in this message 
shall not be understood as given or endorsed by Peridale Ltd, its subsidiaries 
or their employees, unless expressly so stated. It is the responsibility of the 
recipient to ensure that this email is virus free; therefore neither Peridale 
Ltd, its subsidiaries nor their employees accept any responsibility.

 

From: Liaw, Huat (MTO) [mailto:[email protected]] 
Sent: 08 April 2015 19:08
To: [email protected]
Subject: RE: Unable to load file from local to HDFS cluster

 

Should be hadoop dfs -put (note that hadoop dfs is deprecated in Hadoop 2.x; hdfs dfs -put is the preferred form)

 

From: sandeep vura [mailto:[email protected]] 
Sent: April 8, 2015 1:53 PM
To: [email protected]
Subject: Unable to load file from local to HDFS cluster

 

Hi,

 

When loading a file from the local filesystem to the HDFS cluster using the command below,
 

hadoop fs -put sales.txt /sales_dept.

 

I am getting the following exception. Please let me know how to resolve this issue 
ASAP. Please find attached the logs displayed on the namenode.

 

Regards,

Sandeep.v
