Fs -count throws AccessControlException when files/directories not owned by 
the user are present in the user's directory.
---------------------------------------------------------------------------------------------------------------------

                 Key: HDFS-452
                 URL: https://issues.apache.org/jira/browse/HDFS-452
             Project: HDFS
          Issue Type: Bug
    Affects Versions: 0.20.1
            Reporter: Ravi Phulari



If any files/directories owned by another user are present in the user's 
directory, then an fs -count operation on that directory throws an 
AccessControlException.

e.g.

{code}
[rphul...@some-host ~]$ hadoop fs -ls /user/ | grep rphulari
drwxr--r--   - rphulari    users            0 2009-06-09 22:05 /user/rphulari

[rphul...@some-host ~]$ hadoop fs -ls
Found 3 items
drwx------   - hdfs     users          0 2009-04-17 01:11 /user/rphulari/temp
drwxr--r--   - rphulari users          0 2009-05-06 22:02 /user/rphulari/temp2
-rw-r--r--   3 rphulari users          0 2009-05-06 22:11 /user/rphulari/test

[rphul...@some-host ~]$ hadoop fs -count /user/rphulari
count: org.apache.hadoop.security.AccessControlException: Permission denied: 
user=rphulari, access=READ_EXECUTE, inode="temp":hdfs:users:rwx------
{code}

Ideally, the output should include quota information for the dirs/files the 
user owns, plus an error notification for each file/dir not owned by the user.
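A minimal sketch of the desired lenient behavior. This is not Hadoop's actual implementation; it is a local-filesystem analogy using `java.nio` in place of HDFS, and the class name `LenientCount` is hypothetical. The idea is to catch the access-denied exception per entry, record an error message, and continue counting instead of aborting the whole operation:

```java
import java.io.IOException;
import java.nio.file.AccessDeniedException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: count what is readable, collect an error note for
// each inaccessible entry, and keep going (rather than throwing and
// aborting the entire count, as fs -count currently does).
public class LenientCount {
    public long dirs, files, bytes;
    public final List<String> errors = new ArrayList<>();

    public void count(Path p) {
        try {
            if (Files.isDirectory(p)) {
                dirs++;
                try (DirectoryStream<Path> ds = Files.newDirectoryStream(p)) {
                    for (Path child : ds) {
                        count(child);
                    }
                }
            } else {
                files++;
                bytes += Files.size(p);
            }
        } catch (AccessDeniedException e) {
            // Report the inaccessible entry but do not abort the whole count.
            errors.add("count: permission denied: " + p);
        } catch (IOException e) {
            errors.add("count: " + e.getMessage());
        }
    }
}
```

With this approach, `hadoop fs -count /user/rphulari` would still print the totals for `temp2` and `test`, and emit one permission-denied line for `temp`.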
