Hi Alex,

Your Hadoop program configuration is looking into the local filesystem directory.

By default, fs.default.name in core-site.xml points to the local file system (file:///), which is why your input resolves to file:/tmp/in. If the file resides on HDFS, please point fs.default.name to HDFS:

Configuration conf = getConf();
conf.set("fs.default.name", "hdfs://localhost.localdomain:8020/");

This can also be configured in core-site.xml; please update it with the appropriate value, e.g.:

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
  <description>The name of the default file system. A URI whose scheme and
  authority determine the FileSystem implementation. The uri's scheme
  determines the config property (fs.SCHEME.impl) naming the FileSystem
  implementation class. The uri's authority is used to determine the host,
  port, etc. for a filesystem.</description>
</property>

Thanks,
Hardik

On Thu, May 1, 2014 at 9:12 AM, Alex Lee <[email protected]> wrote:
> I am using Eclipse (Kepler) to run the WordCount example on Hadoop 2.2 with
> plugin 2.2.
>
> I am trying Run As > Run Configurations, and the Arguments is /tmp/in /tmp/out
>
> The console always says:
> 2014-05-01 21:05:46,280 ERROR [main] security.UserGroupInformation
> (UserGroupInformation.java:doAs(1494)) - PriviledgedActionException as:root
> (auth:SIMPLE)
> cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input
> path does not exist: file:/tmp/in
> org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path
> does not exist: file:/tmp/in
>
> Please see the screenshot:
> http://postimg.org/image/jtpo6nox3/
>
> Both the hadoop command and DFS Locations can find the file.
>
> Any suggestions? Thanks.
>
> Alex
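For what it's worth, the scheme/authority split that the core-site.xml description talks about can be seen with plain java.net.URI, no Hadoop dependency needed. This is only an illustration of how a default-FS value is decomposed, not Hadoop's actual resolution code (the URI values are the ones from this thread):

```java
import java.net.URI;

public class DefaultFsUri {
    public static void main(String[] args) {
        // The default-FS value suggested for core-site.xml above.
        URI hdfs = URI.create("hdfs://localhost:54310");
        // The local-FS default that produces "file:/tmp/in" in the error.
        URI local = URI.create("file:///tmp/in");

        // The scheme selects the FileSystem implementation (fs.SCHEME.impl).
        System.out.println(hdfs.getScheme());  // hdfs
        System.out.println(local.getScheme()); // file

        // The authority supplies the host and port for HDFS.
        System.out.println(hdfs.getHost());    // localhost
        System.out.println(hdfs.getPort());    // 54310

        // file:/// carries no host/port, so paths resolve on the local disk.
        System.out.println(local.getAuthority());
    }
}
```

So with the default file:/// setting, /tmp/in is looked up on the local disk of the machine running the job, which is exactly why the hadoop command and DFS Locations see the file in HDFS while the Eclipse run does not.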
