It looks like "/user/MyId/input/conf" does not exist on HDFS.

Try this:

cd $HADOOP_HOME_DIR    (your Hadoop root directory)
hadoop fs -put conf input/conf

And then run the MR job again.
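A minimal sketch of the full sequence, assuming $HADOOP_HOME_DIR points at the Hadoop root and the jar name from the thread (hadoop-mapred-examples-0.22.0.jar); relative HDFS paths resolve under /user/<your-id>:

```shell
# Copy the local conf directory into HDFS as input/conf
# (relative paths land under /user/<your-id> on HDFS)
cd $HADOOP_HOME_DIR
bin/hadoop fs -put conf input/conf

# Verify the files actually landed where the job expects them
bin/hadoop fs -ls input/conf

# Re-run the grep example against the uploaded input
bin/hadoop jar hadoop-mapred-examples-0.22.0.jar grep input output 'dfs[a-z.]+'

# Inspect the results
bin/hadoop fs -cat output/part-00000
```

The commands above need a running HDFS/MapReduce setup, so adjust paths to your installation.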

-Prashant Kommireddi

On Fri, Dec 23, 2011 at 3:40 PM, Pat Flaherty <[email protected]> wrote:

> Hi,
>
> Installed 0.22.0 on CentOS 5.7.  I can start dfs and mapred and see their
> processes.
>
> Ran the first grep example: bin/hadoop jar hadoop-*-examples.jar grep
> input output 'dfs[a-z.]+'.  It seems the correct jar name is
> hadoop-mapred-examples-0.22.0.jar - there are no other hadoop*examples*.jar
> files in HADOOP_HOME.
>
> Didn't work.  Then found and tried pi (compute pi) - that works, so my
> installation is to some degree of approximation good.
>
> Back to grep.  It fails with
>
> > java.io.FileNotFoundException: File does not exist: /user/MyId/input/conf
>
> Found and ran bin/hadoop fs -ls.  OK these directory names are internal to
> hadoop (I assume) because Linux has no idea of /user.
>
> And the directory is there - but the program is failing.
>
> Any suggestions; where to start; etc?
>
> Thanks - Pat
>