Files View accesses HDFS as the currently logged-in Ambari user. The output below (assuming it's coming from Files View) shows that you were logged in to Ambari as "admin" and tried to delete files/dirs belonging to "root" with 755 permissions. This will not work, as the "admin" user does not have any superuser permission in HDFS (from HDFS's perspective, it's just a regular user named "admin").
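
You can see the same thing from the HDFS command line. On a cluster without Kerberos, the HDFS client honors the HADOOP_USER_NAME environment variable, so the sketch below (paths taken from your error messages; assuming simple authentication and shell access on a cluster node) shows the delete failing as "admin" and succeeding as "hdfs":

# HDFS applies POSIX-style permission checks: /user/hadoop/sf-salaries is
# owned by root:hdfs with mode 755, so only the owner (or the HDFS superuser)
# can remove entries from it.
HADOOP_USER_NAME=admin hdfs dfs -rm -r -skipTrash /user/hadoop/sf-salaries
# fails with the same "Permission denied: user=admin, access=WRITE" error quoted below

# The "hdfs" user is the HDFS superuser, so the same command succeeds:
HADOOP_USER_NAME=hdfs hdfs dfs -rm -r -skipTrash /user/hadoop/sf-salaries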
Here's a workaround that I can think of: if you want to use Files View to manage all files/dirs in HDFS, you can create a user called "hdfs" in Ambari and log into Ambari as that user. Then, when you operate on HDFS files/dirs with Files View, you'll be acting as the "hdfs" superuser.

Yusaku

From: Sumit Mohanty
Reply-To: "[email protected]"
Date: Tuesday, July 11, 2017 at 10:45 PM
To: "[email protected]"
Subject: Re: proper Ambari permissions

Oh! These are files in HDFS. You can delete them through the HDFS command line after logging in as the hdfs user. Something like:

su hdfs
hdfs dfs -ls /user/
drwxrwx---   - ambari-qa hdfs          0 2017-07-12 03:37 /user/ambari-qa
hdfs dfs -rmr /user/ambari-qa

I am not familiar with the usage of Files View, but it looks like creating and logging in as user "hdfs" should work.

________________________________
From: Adaryl Wakefield <[email protected]>
Sent: Tuesday, July 11, 2017 9:57 PM
To: [email protected]
Subject: RE: proper Ambari permissions

Actually, they aren’t even all files; I can’t blow away directories either. The files that I do have are the sample salaries data you can get from doing the file management tutorial from Hortonworks:

wget https://raw.githubusercontent.com/hortonworks/data-tutorials/893ba0221e2c76c91e9e2baa030323a42abcdf09/tutorials/hdp/hdp-2.5/manage-files-on-hdfs-via-cli-ambari-files-view/assets/sf-salary-datasets/sf-salaries-2011-2013.csv

Below is the error I get when I try to delete stuff from the GUI:

Permission denied: user=admin, access=WRITE, inode="/user/hadoop/sf-salaries":root:hdfs:drwxr-xr-x
Permission denied: user=admin, access=WRITE, inode="/user/hadoop/sf-salaries-2011-2013/sf-salaries-2011-2013.csv":root:hdfs:drwxr-xr-x

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData

From: Sumit Mohanty [mailto:[email protected]]
Sent: Tuesday, July 11, 2017 11:04 PM
To: [email protected]
Subject: Re: proper Ambari permissions

Can you provide examples of some of the files? In general, Ambari runs as root unless configured with a custom user. Some of the files it manages may be created as the service user (for example, HDFS data directories are owned by the HDFS service user).

-Sumit

________________________________
From: Adaryl Wakefield <[email protected]>
Sent: Tuesday, July 11, 2017 8:30 PM
To: [email protected]
Subject: proper Ambari permissions

When I’m working in Ambari, sometimes I can’t manage files because whatever user I’m working under doesn’t have permission.

1. What account does Ambari use when it is interacting with the various other programs?
2. How do I need to set my permissions so that when I’m in Ambari as admin, I’m able to create and blow away things at will?

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData
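
For completeness, here is a minimal sketch of the other way to address question 2 above: instead of logging into Ambari as "hdfs", hand ownership of the tutorial directories to the "admin" user so Files View can manage them as-is. This assumes shell access on a cluster node; the paths come from the error messages above, and the final chown of /user/hadoop itself is only needed if you also want to remove those directories, not just their contents:

su hdfs
# Give "admin" ownership of the tutorial data so it can delete the contents...
hdfs dfs -chown -R admin:hdfs /user/hadoop/sf-salaries
hdfs dfs -chown -R admin:hdfs /user/hadoop/sf-salaries-2011-2013
# ...and of the parent directory so it can remove the directories themselves.
hdfs dfs -chown admin:hdfs /user/hadoop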
