You can probably use hadoop fs -chmod <permission> <filename>, as suggested above. You can set r/w permissions just as you would for ordinary Unix files. Can you please share your experience with this?
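A minimal sketch of the commands in question (the path, user, and permission values are hypothetical examples; running these requires an HDFS cluster and appropriate privileges):

```shell
# Restrict a file so only its owner can read/write it.
# First make userX the owner, then drop group/other access:
hadoop fs -chown userX /data/fileA
hadoop fs -chmod 600 /data/fileA

# Verify the resulting owner and permission bits:
hadoop fs -ls /data/fileA
```

Note that the HDFS superuser (the user the NameNode runs as) can always read the file regardless of its permission bits.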
Thanks,
Praveenesh

On Wed, Feb 22, 2012 at 4:37 PM, Ben Smithers <smithers....@googlemail.com> wrote:

> Hi Shreya,
>
> A permissions guide for HDFS is available at:
> http://hadoop.apache.org/common/docs/current/hdfs_permissions_guide.html
>
> The permissions system is much the same as on Unix-like systems, with
> users and groups. Though I have not worked with this, I think it is
> likely that all permissions will need to be applied after putting files
> into HDFS.
>
> Hope that helps,
>
> Ben
>
> On 22 February 2012 10:41, <shreya....@cognizant.com> wrote:
>
> > Hi,
> >
> > I want to implement security at the file level in Hadoop, essentially
> > restricting certain data to certain users.
> >
> > Ex - File A can be accessed only by user X.
> > File B can be accessed only by user X and user Y.
> >
> > Is this possible in Hadoop, and how do we do it? At what level are
> > these permissions applied (before copying to HDFS, or after putting
> > the files into HDFS)?
> >
> > When a file gets replicated, does it retain these permissions?
> >
> > Thanks,
> > Shreya
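The two cases Shreya describes map directly onto owner and group permissions. A sketch under stated assumptions (the paths, user names, and the group "projteam" are hypothetical; the group must exist in the user/group mapping the NameNode consults, which by default is the OS groups on the NameNode host):

```shell
# Case 1: File A readable only by user X.
hadoop fs -chown userX /data/fileA
hadoop fs -chmod 600 /data/fileA   # owner rw, group none, others none

# Case 2: File B readable by user X and user Y.
# Put both users in a shared group (here "projteam") on the NameNode host,
# then grant access through the group bits:
hadoop fs -chown userX:projteam /data/fileB
hadoop fs -chmod 640 /data/fileB   # owner rw, group r, others none
```

On the replication question: permission bits are metadata held by the NameNode per file, not per block replica, so block replication does not change or duplicate them.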