On Nov 9, 2010, at 1:29 PM, Steve Lewis wrote:

> Using a copy of the Cloudera security-enabled CDH3b3, we installed vanilla
> Hadoop in /home/www/hadoop
> 
> Now when I try to run a job as myself I get permission errors -
> I am not even sure if the error is in writing to local files or HDFS, or
> where staging is, but I need to set permissions to allow the job to work
> 
> Any bright ideas?

        This is an HDFS permissions error.  The top of the stack trace tells 
you pretty much what is going on:

> 10/11/09 12:58:04 WARN conf.Configuration: mapred.task.id is deprecated.
> Instead, use mapreduce.task.attempt.id
> Exception in thread "main"
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=slewis, access=WRITE, inode="staging":www:supergroup:rwxr-xr-x

User slewis can't write to the staging directory because:

- the dir is owned by www
- the dir permissions don't let anyone else write to it
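
If that turns out to be the problem, one quick way out (just a sketch: substitute
whatever path the namenode actually complained about, and run the commands as the
user that started the namenode, i.e. the HDFS superuser) is to hand the dir over
to the submitting user or open it up:

    hadoop fs -chown slewis /path/to/staging
    # or, if several different users will be submitting jobs:
    hadoop fs -chmod 777 /path/to/staging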

If you aren't sure where the staging dir is, check the HDFS audit log.
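
For example (the log path below is just a guess; where the audit appender writes
depends on how log4j.properties is set up on your namenode), grepping for the user
will show which paths the job client is hitting:

    grep 'ugi=slewis' /var/log/hadoop/hdfs-audit.log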
