Re: Hadoop setup questions

2009-02-13 Thread Rasit OZDAS
I agree with Amar and James. If you require permissions for your project, then: 1. create a group in Linux for your users. 2. give the group write access to all files in HDFS (hadoop dfs -chmod -R g+w /, or something like that, I'm not totally sure). 3. change group ownership of all files in HDFS. (hadoop dfs
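
For reference, the full sequence Rasit is sketching would look something like the commands below. This is only a sketch: the group name hadoopusers and the user names userA/userB are hypothetical, and the HDFS commands use the 0.19-era hadoop dfs FsShell form.

    # Create a shared Linux group and add the HDFS users to it
    # (group and user names are made up for illustration)
    groupadd hadoopusers
    usermod -a -G hadoopusers userA
    usermod -a -G hadoopusers userB

    # In HDFS: grant group write access everywhere, then hand
    # group ownership of everything to the shared group
    hadoop dfs -chmod -R g+w /
    hadoop dfs -chgrp -R hadoopusers /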

Re: Hadoop setup questions

2009-02-13 Thread Rasit OZDAS
With this configuration, any user having that group name will be able to write to any location. (I've only tried this on a local network, though.) 2009/2/14 Rasit OZDAS rasitoz...@gmail.com: I agree with Amar and James. If you require permissions for your project, then: 1. create a group in Linux
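
Concretely (a hypothetical session; the user and path names are invented for illustration): once userB belongs to the shared group, a write anywhere in HDFS goes through:

    # run as userB, a member of the shared group
    hadoop dfs -put results.txt /user/userA/results.txt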

Hadoop setup questions

2009-02-11 Thread bjday
Good morning everyone, I have a question about the correct setup for Hadoop. I have 14 Dell computers in a lab, each connected to the internet and each independent of the others. All run CentOS. Logins are handled by NIS. If userA logs into the master and starts the daemons and UserB logs

Re: Hadoop setup questions

2009-02-11 Thread Amar Kamat
bjday wrote: Good morning everyone, I have a question about the correct setup for Hadoop. I have 14 Dell computers in a lab, each connected to the internet and each independent of the others. All run CentOS. Logins are handled by NIS. If userA logs into the master and starts the daemons

Re: Hadoop setup questions

2009-02-11 Thread james warren
Like Amar said. Try adding <property><name>dfs.permissions</name><value>false</value></property> to your conf/hadoop-site.xml file (or flip the value in hadoop-default.xml), restart your daemons and give it a whirl. cheers, -jw On Wed, Feb 11, 2009 at 8:44 PM, Amar Kamat ama...@yahoo-inc.com wrote:
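
Spelled out, the snippet James is suggesting sits inside the top-level configuration element of conf/hadoop-site.xml, roughly like this (note that setting dfs.permissions to false turns off HDFS permission checking cluster-wide, so it is best kept to trusted setups):

    <configuration>
      <!-- other site-specific properties -->
      <property>
        <name>dfs.permissions</name>
        <value>false</value>
      </property>
    </configuration>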