The permissions look alright if the TT is also run by 'hadoopmachine'. Can
you also check whether you have adequate free space, as reported by
df -h /home/hadoopmachine?
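
For reference, a quick way to check both in one go; this sketch assumes
the single mapred.local.dir from your config, so adjust the path if
yours differs:

  # ownership and permissions of the local dir and its parents
  ls -ld /home/hadoopmachine /home/hadoopmachine/hadoop_data \
         /home/hadoopmachine/hadoop_data/mapred

  # free space and free inodes on the filesystem holding it
  df -h /home/hadoopmachine
  df -i /home/hadoopmachine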

On Tue, Apr 3, 2012 at 10:28 PM, Bas Hickendorff
<hickendorff...@gmail.com> wrote:
> Thanks for your help!
> However, as far as I can see, the user has those rights.
>
> I have in mapred-site.xml:
>
>   <property>
>      <name>mapred.local.dir</name>
>      <value>/home/hadoopmachine/hadoop_data/mapred</value>
>    <final>true</final>
>   </property>
>
>
> and the directories look like this:
>
> hadoopmachine@debian:~$ cd /home/hadoopmachine/hadoop_data/mapred
> hadoopmachine@debian:~/hadoop_data/mapred$ ls -lah
> total 24K
> drwxr-xr-x 6 hadoopmachine hadoopmachine 4.0K Apr  3 12:11 .
> drwxr-xr-x 6 hadoopmachine hadoopmachine 4.0K Apr  3 08:26 ..
> drwxr-xr-x 2 hadoopmachine hadoopmachine 4.0K Apr  3 12:10 taskTracker
> drwxr-xr-x 2 hadoopmachine hadoopmachine 4.0K Apr  3 12:10 tt_log_tmp
> drwx------ 2 hadoopmachine hadoopmachine 4.0K Apr  3 12:10 ttprivate
> drwxr-xr-x 2 hadoopmachine hadoopmachine 4.0K Apr  3 08:28 userlogs
>
> hadoopmachine@debian:~/hadoop_data/mapred$ cd ..
> hadoopmachine@debian:~/hadoop_data$ ls -lah
> total 24K
> drwxr-xr-x  6 hadoopmachine hadoopmachine 4.0K Apr  3 08:26 .
> drwxr-xr-x 31 hadoopmachine hadoopmachine 4.0K Apr  3 12:08 ..
> drwxr-xr-x  6 hadoopmachine hadoopmachine 4.0K Apr  3 12:10 data
> drwxr-xr-x  6 hadoopmachine hadoopmachine 4.0K Apr  3 12:11 mapred
> drwxr-xr-x  5 hadoopmachine hadoopmachine 4.0K Apr  3 12:09 name
> drwxr-xr-x  4 hadoopmachine hadoopmachine 4.0K Apr  3 10:11 tmp
>
>
> As far as I can see (though my Linux permissions knowledge might be
> failing me), the user "hadoopmachine" has the necessary rights on
> these folders. I confirmed that this user is indeed the one that runs
> the TaskTracker.
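
For what it's worth, a quick way to double-check which user the
TaskTracker JVM actually runs as (in case it was started from an init
script under a different account); this assumes a single TT process on
the box:

  # the bracket keeps grep from matching its own process line
  ps aux | grep [T]askTracker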
>
> Are there any other things I could check?
>
>
> Regards,
>
> Bas
>
> On Tue, Apr 3, 2012 at 6:12 PM, Harsh J <ha...@cloudera.com> wrote:
>> Some of your TaskTrackers' mapred.local.dirs do not have proper r/w
>> permissions set on them. Make sure they are owned by the user that
>> runs the TT service and have read/write permission at least for that
>> user.
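
A minimal sketch of that fix, assuming a single local dir at
/home/hadoopmachine/hadoop_data/mapred and a TT user 'hadoopmachine'
(adjust path and user to your setup; run with sudo if the directories
are currently owned by someone else):

  # give the TT user ownership of the whole local dir tree
  chown -R hadoopmachine:hadoopmachine /home/hadoopmachine/hadoop_data/mapred
  # owner: rwx; group/other: read, plus execute on directories
  chmod -R u+rwX,go+rX /home/hadoopmachine/hadoop_data/mapred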
>>
>> On Tue, Apr 3, 2012 at 6:58 PM, Bas Hickendorff
>> <hickendorff...@gmail.com> wrote:
>>> Hello all,
>>>
>>> My map-reduce operation on Hadoop (running on Debian) starts correctly
>>> and finds the input file. However, just after the map-reduce starts,
>>> Hadoop tells me that it cannot find a file. Unfortunately, it does
>>> not state which file it cannot find, or where it is looking. Does
>>> anyone know what this file error is about? See below for the
>>> complete error.
>>>
>>> Since the Java error is in the chmod() function (judging from the
>>> stack trace in the output), I assume it is a permissions problem, but
>>> how do I know which permissions to change if it gives me no path?
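
One way to answer exactly that, outside of Hadoop, is to watch the
TaskTracker's system calls while a task initializes: the failing path
shows up on the ENOENT line. A sketch, assuming a Linux box with strace
installed and a single TT process (start it, then resubmit the job):

  # attach to the running TaskTracker and log only path-related calls
  strace -f -e trace=chmod,mkdir,open,stat \
    -p $(pgrep -f org.apache.hadoop.mapred.TaskTracker) 2>&1 | grep ENOENT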
>>>
>>> Thanks in advance,
>>>
>>> Bas
>>>
>>>
>>>
>>>
>>> The output of the job:
>>>
>>>
>>> hadoopmachine@debian:~$ ./hadoop-1.0.1/bin/hadoop jar
>>> hadooptest/main.jar nl.mydomain.hadoop.debian.test.Main
>>> /user/hadoopmachine/input /user/hadoopmachine/output
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> 12/04/03 08:05:08 WARN mapred.JobClient: Use GenericOptionsParser for
>>> parsing the arguments. Applications should implement Tool for the
>>> same.
>>> ****hdfs://localhost:9000/user/hadoopmachine/input
>>> 12/04/03 08:05:08 INFO input.FileInputFormat: Total input paths to process : 1
>>> 12/04/03 08:05:08 INFO mapred.JobClient: Running job: job_201204030722_0004
>>> 12/04/03 08:05:09 INFO mapred.JobClient:  map 0% reduce 0%
>>> 12/04/03 08:05:13 INFO mapred.JobClient: Task Id :
>>> attempt_201204030722_0004_m_000002_0, Status : FAILED
>>> Error initializing attempt_201204030722_0004_m_000002_0:
>>> ENOENT: No such file or directory
>>>        at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>>>        at org.apache.hadoop.fs.FileUtil.execSetPermission(FileUtil.java:692)
>>>        at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:647)
>>>        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>>>        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>>>        at org.apache.hadoop.mapred.JobLocalizer.initializeJobLogDir(JobLocalizer.java:239)
>>>        at org.apache.hadoop.mapred.DefaultTaskController.initializeJob(DefaultTaskController.java:196)
>>>        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1226)
>>>        at java.security.AccessController.doPrivileged(Native Method)
>>>        at javax.security.auth.Subject.doAs(Subject.java:416)
>>>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>>>        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1201)
>>>        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1116)
>>>        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2404)
>>>        at java.lang.Thread.run(Thread.java:636)
>>>
>>> 12/04/03 08:05:13 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201204030722_0004_m_000002_0&filter=stdout
>>> 12/04/03 08:05:13 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201204030722_0004_m_000002_0&filter=stderr
>>
>>
>>
>> --
>> Harsh J



-- 
Harsh J
