Yep, I executed all commands with the same user and didn't see anything
about object privileges.
On Fri, Oct 14, 2016 at 7:53 PM, Goden Yao <goden...@apache.org> wrote:
> check: https://cwiki.apache.org/confluence/display/Hive/
> Down at the bottom, there's a table of permission checks.
> - CREATE definitely needs to check WRITE permission on the custom
> LOCATION directory.
> - SELECT should only check whether the user has privilege to view the
> data.
> Did you use the same user to create and select? or it was admin who
> created the table in the first place?
> On Fri, Oct 14, 2016 at 4:08 AM Staņislavs Rogozins <
> stanislavs.rogoz...@gmail.com> wrote:
>> Apparently, in the Hive version I'm dealing with, users are required
>> to have WRITE permission on the table's data directory to CREATE it or SELECT
>> from it, even if you specify a custom LOCATION or make the table EXTERNAL.
>> Some examples:
>> hdfs dfs -mkdir -p /data/test_perm1
>> CREATE TABLE test_perm1 (col1 STRING) LOCATION '/data/test_perm1';
>> hdfs dfs -chmod 550 /data/test_perm1
>> performing `SELECT * FROM test_perm1;` yields something like
>> `Error: Error while compiling statement: FAILED: HiveException
>> java.security.AccessControlException: Permission denied: ..
>> access=WRITE, inode="/data/test_perm1"..`
>> After that, executing
>> CREATE EXTERNAL TABLE test_perm2 (col1 STRING) LOCATION
>> yields a similar error.
>> Why would a SELECT or a CREATE of an EXTERNAL table require WRITE
>> permissions? (Does the second one have to do with setting the sticky bit?)
>> Is this intended behaviour? Is this a bug in 0.14? Could some kind of
>> misconfiguration of Hive be behind this?
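As a quick local-filesystem sanity check of the chmod step above (a sketch using a hypothetical temp directory on plain POSIX, not HDFS or Hive's actual authorization code): mode 550 carries no write bit for owner, group, or other, which is exactly the condition the `access=WRITE ... Permission denied` error is reporting.

```shell
# Local analogue of `hdfs dfs -chmod 550 /data/test_perm1` (illustrative only).
dir="$(mktemp -d)/test_perm1"
mkdir -p "$dir"
chmod 550 "$dir"
# 550 = r-xr-x--- : readable and traversable, but no write bit anywhere,
# so any check that demands WRITE access on this inode must fail.
mode=$(stat -c '%a' "$dir" 2>/dev/null || stat -f '%Lp' "$dir")
echo "$mode"
```

Whether Hive *should* be demanding WRITE here for SELECT is the open question; the snippet only confirms what the mode bits deny.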