AFAIK, there is no Java API available for applying ACLs recursively. Perhaps
you could do a recursive directory listing for the path and invoke the
#setAcl Java API on each entry.
https://hadoop.apache.org/docs/r2.7.2/api/org/apache/hadoop/fs/FileSystem.html#setAcl(org.apache.hadoop.fs.Path,
java.util.List)
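
For example, here is a rough, untested sketch along those lines. It walks
the tree with #listStatus and applies the entry to each directory via
#modifyAclEntries (which merges entries into the existing ACL, whereas
#setAcl replaces the whole ACL). Note that DEFAULT-scope entries are only
valid on directories, so files are skipped here; granting access on files
would need separate ACCESS-scope entries.

    import java.io.IOException;
    import java.util.Collections;
    import java.util.List;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.AclEntry;
    import org.apache.hadoop.fs.permission.AclEntryScope;
    import org.apache.hadoop.fs.permission.AclEntryType;
    import org.apache.hadoop.fs.permission.FsAction;

    public class RecursiveAclExample {

        // Apply the given entries to 'dir' and every directory beneath it.
        // DEFAULT-scope entries are rejected on files, so only directories
        // are touched in this sketch.
        static void applyToDirs(FileSystem fs, Path dir, List<AclEntry> acl)
                throws IOException {
            fs.modifyAclEntries(dir, acl);
            for (FileStatus stat : fs.listStatus(dir)) {
                if (stat.isDirectory()) {
                    applyToDirs(fs, stat.getPath(), acl);
                }
            }
        }

        public static void main(String[] args) throws IOException {
            FileSystem fs = FileSystem.get(new Configuration());
            // Equivalent of the CLI entry: default:user:shashi:rwx
            AclEntry entry = new AclEntry.Builder()
                    .setScope(AclEntryScope.DEFAULT)
                    .setType(AclEntryType.USER)
                    .setName("shashi")
                    .setPermission(FsAction.ALL)
                    .build();
            applyToDirs(fs, new Path("/user"),
                    Collections.singletonList(entry));
        }
    }

You would need to run this as the superuser or as the owner of the paths,
since only the owner or superuser is allowed to modify ACLs.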

Rakesh

On Mon, Sep 19, 2016 at 11:22 AM, Shashi Vishwakarma <
shashi.vish...@gmail.com> wrote:

> Thanks Rakesh.
>
> Just one last question: is there any Java API available for recursively
> applying an ACL, or do I need to iterate over all folders of the
> directory and apply the ACL to each?
>
> Thanks
> Shashi
>
> On 19 Sep 2016 9:56 am, "Rakesh Radhakrishnan" <rake...@apache.org> wrote:
>
>> It looks like '/user/test3' is owned by "hdfs" and access is being
>> denied when operations are performed as the "shashi" user. One idea is
>> to recursively set the ACL on the sub-directories and files as follows:
>>
>>              hdfs dfs -setfacl -R -m default:user:shashi:rwx /user
>>
>>             The -R option applies the operation to all files and
>> directories recursively. Note that "default" entries only affect the
>> ACLs of newly created children; to gain write access to the existing
>> directories themselves, an access entry (without the "default:" prefix)
>> would also be needed, for example:
>>
>>              hdfs dfs -setfacl -R -m user:shashi:rwx /user
>>
>> Regards,
>> Rakesh
>>
>> On Sun, Sep 18, 2016 at 8:53 PM, Shashi Vishwakarma <
>> shashi.vish...@gmail.com> wrote:
>>
>>> I have the following scenario: there is a parent folder /user in HDFS
>>> with five child folders, such as test1, test2, test3, etc.
>>>
>>>     /user/test1
>>>     /user/test2
>>>     /user/test3
>>>
>>> I applied an ACL on the parent folder to make sure the user
>>> automatically has access to the child folders.
>>>
>>>      hdfs dfs -setfacl -m default:user:shashi:rwx /user
>>>
>>>
>>> But when I try to put a file, it gives a permission denied exception:
>>>
>>>     hadoop fs -put test.txt  /user/test3
>>>     put: Permission denied: user=shashi, access=WRITE,
>>> inode="/user/test3":hdfs:supergroup:drwxr-xr-x
>>>
>>> getfacl output:
>>>
>>>     hadoop fs -getfacl /user/test3
>>>     # file: /user/test3
>>>     # owner: hdfs
>>>     # group: supergroup
>>>     user::rwx
>>>     group::r-x
>>>     other::r-x
>>>
>>> Any pointers on this?
>>>
>>> Thanks
>>> Shashi
>>>
>>
>>
