When I ls HDFS, it seems some of the folders are owned by the hdfs user, so if I ssh in as hdfs, it can run the wordcount example.
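A minimal sketch of how the ownership could be checked and, if needed, handed over, assuming a typical cluster where the hdfs superuser is reachable via sudo (the user name alex and the path /user/alex are illustrative placeholders, not taken from the thread):

```shell
# List the home directories; the owner column shows who may write there
hdfs dfs -ls /user

# As the hdfs superuser, create a home directory for the regular user
# and transfer ownership so jobs can be run without ssh-ing as hdfs
sudo -u hdfs hdfs dfs -mkdir -p /user/alex
sudo -u hdfs hdfs dfs -chown alex:alex /user/alex
```

With a properly owned home directory, the wordcount example can be run as the regular user instead of switching to hdfs.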
 
Any suggestion, thanks.
 
Date: Fri, 2 May 2014 10:44:32 -0400
Subject: Re: Wordcount file cannot be located
From: [email protected]
To: [email protected]

Please add the line below to your config; for some reason the hadoop-common jar is being overwritten. Please share your feedback. Thanks.

config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
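A self-contained sketch of the suggested fix, assuming a placeholder NameNode address (namenode:8020). Binding `fs.hdfs.impl` explicitly works around the common case where merging jars clobbers the FileSystem service registration that normally maps the hdfs:// scheme, which produces the "No FileSystem for scheme: hdfs" error seen below:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class HdfsSchemeFix {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        // Point the client at the NameNode (placeholder host/port)
        config.set("fs.defaultFS", "hdfs://namenode:8020/");
        // Bind the hdfs:// scheme explicitly, in case the service file
        // from hadoop-common was lost when jars were merged
        config.set("fs.hdfs.impl", DistributedFileSystem.class.getName());

        FileSystem dfs = FileSystem.get(config);
        System.out.println("FileSystem for hdfs:// is " + dfs.getClass().getName());
    }
}
```

This requires the Hadoop client jars on the classpath and a reachable cluster, so it is a sketch of the shape of the fix rather than something runnable in isolation.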


On Fri, May 2, 2014 at 12:08 AM, Alex Lee <[email protected]> wrote:




I tried to add the code, but it still seems not to be working.
http://postimg.org/image/6c1dat3jx/
 
2014-05-02 11:56:06,780 WARN  [main] util.NativeCodeLoader 
(NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for 
your platform... using builtin-java classes where applicable

java.io.IOException: No FileSystem for scheme: hdfs
 
Also, the Eclipse DFS location can reach /tmp/ but cannot enter /user/.
 
Any suggestion, thanks.
 
alex
 
From: [email protected]

Date: Fri, 2 May 2014 08:43:26 +0530
Subject: Re: Wordcount file cannot be located
To: [email protected]


Try this along with your MapReduce source code:
Configuration config = new Configuration();
config.set("fs.defaultFS", "hdfs://IP:port/");
FileSystem dfs = FileSystem.get(config);
Path path = new Path("/tmp/in");

Let me know your thoughts.
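Extending the snippet above into a self-contained check could look like the following (the NameNode address hdfs://IP:port/ and the input path /tmp/in are the placeholders from the snippet; verifying the input exists before submitting the job narrows down whether the problem is the filesystem scheme or the path itself):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckInput {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        config.set("fs.defaultFS", "hdfs://IP:port/"); // placeholder NameNode address

        FileSystem dfs = FileSystem.get(config);
        Path path = new Path("/tmp/in");

        // Confirm the wordcount input exists and show who owns it,
        // since ownership was also in question in this thread
        if (dfs.exists(path)) {
            for (FileStatus st : dfs.listStatus(path)) {
                System.out.println(st.getPath() + " owner=" + st.getOwner());
            }
        } else {
            System.err.println("Input path not found: " + path);
        }
    }
}
```

Like the original snippet, this needs the Hadoop client jars and a running cluster, so treat it as a diagnostic sketch rather than a verified program.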




-- 
Thanks & Regards

Unmesha Sreeveni U.B
Hadoop, Bigdata Developer





Center for Cyber Security | Amrita Vishwa Vidyapeetham
http://www.unmeshasreeveni.blogspot.in/









                                          

                                          
