Hi,

My datanode and jobtracker are started by the user "hadoop", and the user "Test" needs to submit jobs. When "Test" copies a file to HDFS, there is a permission error:

    /usr/local/hadoop/bin/hadoop dfs -copyFromLocal /home/Test/somefile.txt myapps
    copyFromLocal: org.apache.hadoop.fs.permission.AccessControlException: Permission denied: user=Test, access=WRITE, inode="user":hadoop:supergroup:rwxr-xr-x

Could you please let me know how users other than "hadoop" can access HDFS and then submit MapReduce jobs? Where is this configured, or which default configuration needs to be changed?
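For reference, here is a sketch of one workaround I have seen suggested but have not verified (the property name is an assumption based on the stock hdfs-site.xml defaults of this Hadoop generation): disabling HDFS permission checking cluster-wide.

```xml
<!-- hdfs-site.xml: disable HDFS permission checking entirely.
     Caution: this turns off permission enforcement for ALL users,
     which is probably only appropriate on a test cluster. -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```

A safer alternative might be for the "hadoop" superuser to create and chown a home directory for "Test", e.g. `hadoop dfs -mkdir /user/Test` followed by `hadoop dfs -chown Test /user/Test`, so that only that subtree becomes writable by "Test".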
Thanks, Senthil