Yes, it does, and it contains the file with the input data (the file is called "in").
hadoopmachine@debian:~/hadoop-1.0.1$ bin/hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.
Found 1 items
drwxr-xr-x   - hadoopmachine supergroup          0 2012-04-03 07:11 /user/hadoopmachine/input

hadoopmachine@debian:~/hadoop-1.0.1$ bin/hadoop fs -ls input
Warning: $HADOOP_HOME is deprecated.
Found 1 items
-rw-r--r--   1 hadoopmachine supergroup         74 2012-04-02 10:08 /user/hadoopmachine/input/in

Is this correct?

Regards,
Bas

On Tue, Apr 3, 2012 at 7:12 PM, Serge Blazhievsky <serge.blazhiyevs...@nice.com> wrote:
> Does this directory exist in HDFS
>
> /user/hadoopmachine/input
>
> ???
>
>
> Serge Blazhievsky
>
>
> On 4/3/12 6:28 AM, "Bas Hickendorff" <hickendorff...@gmail.com> wrote:
>
>>Hello all,
>>
>>My map-reduce job on Hadoop (running on Debian) starts correctly and
>>finds the input file. However, just after the map-reduce starts, Hadoop
>>tells me that it cannot find a file. Unfortunately, it does not state
>>which file it cannot find, or where it is looking. Does anyone know
>>what this file error is? See below for the complete error.
>>
>>Since the Java error is in the chmod() function (judging from the
>>stack trace in the output), I assume it is a permissions problem, but
>>how do I know which rights to change if it gives me no path?
>>
>>Thanks in advance,
>>
>>Bas
>>
>>
>>The output of the job:
>>
>>hadoopmachine@debian:~$ ./hadoop-1.0.1/bin/hadoop jar hadooptest/main.jar nl.mydomain.hadoop.debian.test.Main /user/hadoopmachine/input /user/hadoopmachine/output
>>Warning: $HADOOP_HOME is deprecated.
>>
>>12/04/03 08:05:08 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
>>****hdfs://localhost:9000/user/hadoopmachine/input
>>12/04/03 08:05:08 INFO input.FileInputFormat: Total input paths to process : 1
>>12/04/03 08:05:08 INFO mapred.JobClient: Running job: job_201204030722_0004
>>12/04/03 08:05:09 INFO mapred.JobClient:  map 0% reduce 0%
>>12/04/03 08:05:13 INFO mapred.JobClient: Task Id : attempt_201204030722_0004_m_000002_0, Status : FAILED
>>Error initializing attempt_201204030722_0004_m_000002_0:
>>ENOENT: No such file or directory
>>  at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>>  at org.apache.hadoop.fs.FileUtil.execSetPermission(FileUtil.java:692)
>>  at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:647)
>>  at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>>  at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>>  at org.apache.hadoop.mapred.JobLocalizer.initializeJobLogDir(JobLocalizer.java:239)
>>  at org.apache.hadoop.mapred.DefaultTaskController.initializeJob(DefaultTaskController.java:196)
>>  at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1226)
>>  at java.security.AccessController.doPrivileged(Native Method)
>>  at javax.security.auth.Subject.doAs(Subject.java:416)
>>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>>  at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1201)
>>  at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1116)
>>  at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2404)
>>  at java.lang.Thread.run(Thread.java:636)
>>
>>12/04/03 08:05:13 WARN mapred.JobClient: Error reading task output http://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201204030722_0004_m_000002_0&filter=stdout
>>12/04/03 08:05:13 WARN mapred.JobClient: Error reading task output http://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201204030722_0004_m_000002_0&filter=stderr
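The stack trace ends in JobLocalizer.initializeJobLogDir and RawLocalFileSystem, which means the failing mkdir/chmod is happening on the TaskTracker's local disk, not in HDFS, so the missing path is most likely under hadoop.log.dir or mapred.local.dir rather than under /user/hadoopmachine. A rough way to check, assuming a default tarball install under ~/hadoop-1.0.1 (the exact paths below are assumptions and may be overridden in your configuration files):

hadoopmachine@debian:~$ # default log location; hadoop.log.dir may be overridden in conf/hadoop-env.sh
hadoopmachine@debian:~$ ls -ld ~/hadoop-1.0.1/logs ~/hadoop-1.0.1/logs/userlogs
hadoopmachine@debian:~$ # local scratch space; check whether mapred.local.dir or hadoop.tmp.dir is set explicitly
hadoopmachine@debian:~$ grep -A 1 "mapred.local.dir" ~/hadoop-1.0.1/conf/mapred-site.xml
hadoopmachine@debian:~$ grep -A 1 "hadoop.tmp.dir" ~/hadoop-1.0.1/conf/core-site.xml
hadoopmachine@debian:~$ ls -ld /tmp/hadoop-hadoopmachine/mapred/local   # default when mapred.local.dir is not set

Every directory listed should exist and be writable by the user that runs the TaskTracker; if one is missing or owned by another user, the chmod during job localization can fail with exactly this kind of ENOENT.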