If you started the Hadoop daemons as hduser, they will not be shown for
the user foo (or any other user), since the daemons are just Java
processes owned by hduser. You can still run your jobs as any other
user. Ensure that the user foo has access to the Hadoop directories.
You also don't have to create a directory in HDFS for the user. I hope
this resolves your problem.
hduser $ start-all.sh
hduser $ su - other
other $ /home/hduser/hadoop203/bin/hadoop jar \
    /home/hduser/hadoop203/hadoop-examples*.jar pi 1 1
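One caveat: jps only lists the JVMs of the user who runs it, so foo will
not see the daemons there even though they are running; a plain ps does
show them:
other $ jps                        # daemons not listed; they belong to hduser
other $ ps -ef | grep -i namenode  # the daemon is still visible as hduser's process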
-Mohit Kaushik
On 01/18/2016 11:56 AM, Ravikant Dindokar wrote:
Hi Mohit,
Thanks for your reply. Let me elaborate on my problem in detail.
I have installed Hadoop under a user called 'hduser', and HADOOP_HOME
points to a folder in hduser's home directory. Now I have added
another user, foo, on the cluster. I changed the access permissions of
the following directories to 777 (example commands after the list):
1. Hadoop installation directory (pointed to by HADOOP_HOME)
2. dfs.datanode.data.dir
3. dfs.namenode.name.dir
4. hadoop.tmp.dir
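For example, the changes were along these lines (the data/name/tmp paths
below are placeholders; the actual values come from hdfs-site.xml and
core-site.xml):
hduser $ chmod -R 777 $HADOOP_HOME
hduser $ chmod -R 777 /path/to/dfs/data    # dfs.datanode.data.dir
hduser $ chmod -R 777 /path/to/dfs/name    # dfs.namenode.name.dir
hduser $ chmod -R 777 /path/to/hadoop/tmp  # hadoop.tmp.dir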
I have also created the directory /user/foo inside HDFS.
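Roughly like this (the -chown step is an assumption about the intended
ownership):
hduser $ hadoop fs -mkdir -p /user/foo
hduser $ hadoop fs -chown foo /user/foo    # assuming foo should own its HDFS home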
After starting the HDFS and YARN daemons, I am not able to view these
processes as the foo user, and so I am not able to submit jobs.
Can you point out what I am missing here?
Thanks
Ravikant
On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik
<[email protected]> wrote:
Hadoop uses the Linux system users. I think you don't have to make any
changes; just create a new user on your system and give it access to
Hadoop, i.e. grant permissions on the Hadoop installation and data
directories.
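A minimal sketch (the 'hadoop' group name is an assumption; adjust the
group and paths to your layout, and repeat for the data directories):
$ sudo useradd -m foo
$ sudo usermod -aG hadoop foo                 # assuming a 'hadoop' group owns the install
$ sudo chmod -R g+rwx /home/hduser/hadoop203  # grant the group access to the install dir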
-Mohit Kaushik
On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
Hi Hadoop users,
I have hadoop-2.6 installed on my cluster of 11 nodes, under one
specific user. Now I want to allow other users on the cluster to share
the same Hadoop installation. What changes do I need to make in order
to allow access for other users?
Thanks
Ravikant