To: Rakesh Radhakrishnan
Cc: user.hadoop
Subject: Re: HDFS ACL | Unable to define ACL automatically for child folders
Thanks a lot Rakesh. Above information is very much helpful.
Thanks
Shashi
On Mon, Sep 19, 2016 at 12:39 PM, Rakesh Radhakrishnan
<rake...@apache.org> wrote:
AFAIK, there is no Java API available for this. Perhaps you could do a
recursive directory listing for a path and invoke the #setAcl Java API on
each entry.
https://hadoop.apache.org/docs/r2.7.2/api/org/apache/hadoop/fs/FileSystem.html#setAcl(org.apache.hadoop.fs.Path,%20java.util.List)
Rakesh
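The "list recursively, then call setAcl on each path" pattern suggested above can be sketched as follows. Note this is a minimal illustration of the traversal logic only: the Node type below is a hypothetical stand-in for a directory entry, not a Hadoop class. With the real API you would call FileSystem#listStatus(Path), recurse into entries where FileStatus#isDirectory() is true, and call FileSystem#setAcl(path, aclSpec) on each one.

```java
import java.util.ArrayList;
import java.util.List;

public class AclApplier {
    /** Hypothetical stand-in for a directory entry (not a Hadoop type). */
    public static class Node {
        public final String path;
        public final List<Node> children = new ArrayList<>();
        public Node(String path) { this.path = path; }
    }

    /**
     * Walk the tree depth-first, recording every path the ACL would be
     * applied to. With the real API, the add() call below would instead be
     * fs.setAcl(new Path(node.path), aclSpec).
     */
    public static void setAclRecursively(Node node, List<String> applied) {
        applied.add(node.path);
        for (Node child : node.children) {
            setAclRecursively(child, applied);
        }
    }

    public static void main(String[] args) {
        // Build a tiny tree mirroring the paths discussed in the thread.
        Node root = new Node("/user");
        Node test3 = new Node("/user/test3");
        test3.children.add(new Node("/user/test3/file1"));
        root.children.add(test3);

        List<String> applied = new ArrayList<>();
        setAclRecursively(root, applied);
        System.out.println(applied);  // [/user, /user/test3, /user/test3/file1]
    }
}
```

One caveat with this approach: unlike the shell's -setfacl -R, a hand-rolled walk is not atomic, so files created while it runs may be missed.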
On Mon, Sep 19, 2016, Shashi wrote:
Thanks Rakesh.
Just one last question: is there any Java API available for recursively
applying an ACL, or do I need to iterate over all folders of the directory
and apply the ACL to each one?
Thanks
Shashi
On 19 Sep 2016 9:56 am, "Rakesh Radhakrishnan" wrote:
It looks like '/user/test3' is owned by the "hdfs" user, and access is
denied when performing operations as the "shashi" user. One idea is to
recursively set the ACL on sub-directories and files as follows:
hdfs dfs -setfacl -R -m default:user:shashi:rwx /user
The -R option can be used to apply the ACL recursively to all existing
files and sub-directories; the "default:" entry ensures that children
created later under those directories inherit the ACL automatically.
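For reference, a sketch of how the command above might be used and checked. The paths and user name follow the thread; running this requires a live HDFS and sufficient privileges, so treat it as illustrative:

```shell
# Grant "shashi" rwx on everything that already exists under /user
hdfs dfs -setfacl -R -m user:shashi:rwx /user

# Add a default ACL so newly created children inherit the entry
# (default: entries apply to directories only)
hdfs dfs -setfacl -R -m default:user:shashi:rwx /user

# Verify: directories should now show both access and default: entries
hdfs dfs -getfacl /user/test3
```

Note that ACLs must be enabled on the cluster (dfs.namenode.acls.enabled=true) for these commands to succeed.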