This is how I've done it before:

1.) Create a hadoop user/group.
2.) Make the local filesystem dfs directories writable by the hadoop group and set the sticky bit.
3.) Run hadoop as the hadoop user.
4.) Then add all of your users to the hadoop group.
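Roughly, in shell terms (the dfs path and the developer account name below are just placeholders, adjust them for your own layout):

    # run as root
    # 1) create the hadoop group and user
    groupadd hadoop
    useradd -g hadoop hadoop
    # 2) make the local dfs directories group-writable and set the sticky bit
    chgrp -R hadoop /srv/hadoop/dfs
    chmod -R g+w /srv/hadoop/dfs
    chmod +t /srv/hadoop/dfs
    # 3) start the daemons as the hadoop user
    sudo -u hadoop $HADOOP_HOME/bin/start-all.sh
    # 4) add each developer to the hadoop group
    usermod -a -G hadoop bobr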
I also changed the dfs.permissions.supergroup property to "hadoop" in $HADOOP_HOME/conf/hadoop-site.xml.

This works pretty well for us. Hope it helps.

Cheers,
-Xavier

-----Original Message-----
From: Chris Collins [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 11, 2008 5:18 PM
To: [email protected]
Subject: Re: client connect as different username?

The finer point to this is that in development you may be logged in as user x and have a shared hdfs instance that a number of people are using. In that mode it's not practical to sudo, as you have all your development tools set up for user x.

hdfs is set up with a single user; what is the procedure to add users to that hdfs instance? It has to support it, surely? It's really not obvious: looking in the hdfs docs that come with the distro, nothing springs out, and the hadoop command line tool doesn't have anything that vaguely looks like a way to create a user.

Help is greatly appreciated. I am sure it's somewhere blindingly obvious. How are other people doing this, other than sudoing to one single user name?

Thanks

ChRiS

On Jun 11, 2008, at 5:11 PM, [EMAIL PROTECTED] wrote:

> The best way is to use the sudo command to execute the hadoop client.
> Does it work for you?
>
> Nicholas
>
>
> ----- Original Message ----
>> From: Bob Remeika <[EMAIL PROTECTED]>
>> To: [email protected]
>> Sent: Wednesday, June 11, 2008 12:56:14 PM
>> Subject: client connect as different username?
>>
>> Apologies if this is an RTM response, but I looked and wasn't able to
>> find anything concrete. Is it possible to connect to HDFS via the
>> HDFS client under a different username than I am currently logged in
>> as?
>>
>> Here is our situation: I am user bobr on the client machine. I need
>> to add something to the HDFS cluster as the user "companyuser". Is
>> this possible with the current set of APIs, or do I have to upload
>> and "chown"?
>>
>> Thanks,
>> Bob
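For completeness, the sudo route Nicholas suggests looks roughly like this from the client side (the file and path names are placeholders):

    # upload as "companyuser" via sudo
    sudo -u companyuser hadoop dfs -put local.txt /user/companyuser/local.txt

    # or upload as yourself and chown afterwards
    # (chown on HDFS requires superuser or supergroup rights)
    hadoop dfs -put local.txt /user/companyuser/local.txt
    hadoop dfs -chown companyuser /user/companyuser/local.txt

Later Hadoop releases than the one discussed in this thread also honor the HADOOP_USER_NAME environment variable when the cluster runs with simple (non-Kerberos) authentication, which avoids the sudo step entirely:

    HADOOP_USER_NAME=companyuser hadoop fs -put local.txt /user/companyuser/local.txt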
