Make sure /user/smulcahy exists in HDFS.  Also, make sure that
/hadoop/mapred/system in HDFS has permissions 733 and is owned by
hadoop:supergroup.
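Something like the following should do it, run as the hadoop superuser (paths
here match your setup; adjust the mapred system dir if yours differs):

```shell
# Create the user's HDFS home directory and hand it over to them
hadoop fs -mkdir /user/smulcahy
hadoop fs -chown smulcahy /user/smulcahy

# Fix ownership and permissions on the mapred system directory
hadoop fs -chown hadoop:supergroup /hadoop/mapred/system
hadoop fs -chmod 733 /hadoop/mapred/system
```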

Let me know if this doesn't work for you.  Also, what version of Hadoop are
you running?

Hope this helps!

Alex

On Mon, Jun 29, 2009 at 1:11 AM, stephen mulcahy
<[email protected]> wrote:

> Alex Loddengaard wrote:
>
>> Have you tried to run the example job as the superuser?  It seems like this
>> might be an issue where hadoop.tmp.dir doesn't have the correct
>> permissions.  hadoop.tmp.dir and dfs.data.dir should be owned by the unix
>> user running your Hadoop daemons and owner-writable and -readable.
>>
>> Can you confirm this is the case?  Thanks,
>>
>
> Hi Alex,
>
> The RandomWriter example runs without any problems when run as the hadoop
> user (i.e. the superuser / user that runs the hadoop daemons).
>
> hadoop.tmp.dir permissions
>
> smulc...@hadoop01:~$ ls -la /data1/hadoop-tmp/
> total 16
> drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 14:01 .
> drwxr-xr-x 5 root   root   4096 2009-06-19 10:12 ..
> drwxr-xr-x 4 hadoop hadoop 4096 2009-06-19 10:16 dfs
> drwxr-xr-x 3 hadoop hadoop 4096 2009-06-19 10:49 mapred
>
>
>
> smulc...@hadoop01:~$ ls -la /data?/hdfs
> /data1/hdfs:
> total 8
> drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
> drwxr-xr-x 5 root   root   4096 2009-06-19 10:12 ..
>
> /data2/hdfs:
> total 8
> drwxr-xr-x 2 hadoop hadoop 4096 2009-06-19 10:12 .
> drwxr-xr-x 4 root   root   4096 2009-06-19 10:12 ..
>
> Does hadoop.tmp.dir need to be writeable by all users running hadoop jobs?
>
>
> -stephen
>
> --
> Stephen Mulcahy, DI2, Digital Enterprise Research Institute,
> NUI Galway, IDA Business Park, Lower Dangan, Galway, Ireland
> http://di2.deri.ie    http://webstar.deri.ie    http://sindice.com
>
