Sebastian, from the other posting it looks like you need to change permissions 
for `/` in HDFS. Do this with:

    sudo /usr/local/hadoop/bin/hdfs dfs -chmod -R 777 /

This will grant every Linux user full permissions to all parts of HDFS.

The permission system is very much like ’Nix OSes: whatever user is running a 
process is the user HDFS checks permissions against, just as in the local FS. 
So if write permission is not granted to “ubuntu” or “aml” or …, you must 
switch to the HDFS superuser (the account the NameNode runs as, not 
necessarily root) and change permissions.
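
For example, to check who actually owns the failing path before changing 
anything (the path and the Hadoop install location are taken from elsewhere 
in this thread):

    # -d shows the directory entry itself rather than its contents
    /usr/local/hadoop/bin/hdfs dfs -ls -d /tmp
    # sample output: drwxr-xr-x  - hadoop supergroup  0 2017-07-04 10:30 /tmp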

Read about a similar problem here: 
https://community.cloudera.com/t5/CDH-Manual-Installation/Permission-denied-user-mapred-access-WRITE-inode-quot-quot-hdfs/td-p/16318

On Jul 4, 2017, at 2:31 PM, Pat Ferrel <[email protected]> wrote:

Sebastian:

Try logging in as the “aml” user. Hadoop HDFS has its own permission system, 
with conventions similar to ’Nix OSes. There is a `/user/aml/` directory in 
HDFS which acts as the home directory for any user logged in as “aml”. If you 
`dfs -put file`, it will go to `/user/<’nix-username>/file`, where 
<’nix-username> is the username of whoever is logged in. You can also sudo to 
change permissions. In any case, logging in as the “aml” user should be all 
that is required, since the processes created will run under that user’s 
account. If you need to set permissions in the local file system, set them 
for the “aml” account.
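
A quick sketch of that behavior (the file name here is made up), run while 
logged in as “aml”:

    # with a relative destination, -put resolves against the HDFS home directory
    hdfs dfs -put somefile.txt       # ends up at /user/aml/somefile.txt
    hdfs dfs -ls /user/aml           # confirm it landed there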

Marius:

Yes, this is the typical way to install HDFS in the wider world (non-PIO). 
He is using an AWS AMI created by ActionML. 


On Jul 4, 2017, at 1:32 PM, Marius Rabenarivo <[email protected]> wrote:

It seems like you installed Hadoop using the hadoop user account.

Try to re-install it using the user account you intend to use
when running PredictionIO.

2017-07-04 21:54 GMT+04:00 Pat Ferrel <[email protected]>:
If you are using the AWS AMI from ActionML, you should grant access to user 
“aml”, since PIO is set up to run as this user.
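
For instance, a minimal sketch, assuming the HDFS superuser is the “hadoop” 
account that owns the inode in the error below, and reusing the temp path 
from that error:

    # hand the job's temp tree to "aml" rather than opening up all of HDFS
    sudo -u hadoop /usr/local/hadoop/bin/hdfs dfs -chown -R aml /tmp/AV0N2sqO1Bi_4YGMpU1i-0-als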


On Jul 4, 2017, at 9:27 AM, Dan Guja <[email protected]> wrote:

Hi

Did you try giving write access to /tmp/ for user ubuntu (as the error says)?
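
One way to do that, assuming the NameNode runs as the “hadoop” user shown as 
the inode owner in the error:

    # 1777 mirrors the local /tmp convention: world-writable plus the sticky bit
    sudo -u hadoop /usr/local/hadoop/bin/hdfs dfs -chmod -R 1777 /tmp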

On Tue, Jul 4, 2017 at 10:37 AM, Sebastian Fix <[email protected]> wrote:
Hi all,

Anyone experiencing similar problems?

    Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=ubuntu, access=WRITE, inode="/tmp/AV0N2sqO1Bi_4YGMpU1i-0-als/rank/_temporary/0":hadoop:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        ..
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=ubuntu, access=WRITE, inode="/tmp/AV0N2sqO1Bi_4YGMpU1i-0-als/rank/_temporary/0":hadoop:supergroup:drwxr-xr-x
        ..

A link to the console error (error2.mp4): 
https://www.dropbox.com/s/0t9laycpkbqpkfr/error2.mp4?dl=0
Cheers, Felix





