Other users should be able to submit jobs using the same commands (bin/hadoop ...). Did you run into any errors? One thing to watch for is that you'll need to grant them permissions on any files in HDFS that you want them to read. You can do this with bin/hadoop fs -chmod, which works like chmod on Linux. You may need to run it as the root user (sudo bin/hadoop fs -chmod). Also, I don't remember exactly, but you may need to create home directories for them in HDFS as well (again, create them as root, and then sudo bin/hadoop fs -chown them to the new users).
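Something like the following should do it; I'm assuming a new user named "alice" and a shared input directory /data here, so adjust the names and paths for your setup:

```shell
# Create a home directory for the new user in HDFS
# (run as the user that started Hadoop, or via sudo as below).
sudo bin/hadoop fs -mkdir /user/alice

# Hand ownership of the home directory over to the new user.
sudo bin/hadoop fs -chown alice /user/alice

# Grant read access to a shared input directory; the mode
# syntax works like chmod on Linux (-R recurses).
sudo bin/hadoop fs -chmod -R o+r /data
```

After that, alice should be able to run bin/hadoop jobs against /data and write her output under /user/alice.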
On Tue, Feb 17, 2009 at 10:48 AM, Nicholas Loulloudes < [email protected]> wrote:
> Hi all,
>
> I just installed Hadoop (Single Node) on a Linux Ubuntu distribution as
> per the instructions found on the following website:
>
> http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)
>
> I followed the instructions on the website to create a "hadoop" system
> user and group, and I was able to run a MapReduce job successfully.
>
> What I want to do now is to create more system users who will be able
> to use Hadoop for running MapReduce jobs.
>
> Is there a guide on how to achieve this?
>
> Any suggestions will be highly appreciated.
>
> Thanks in advance,
>
> --
> _________________________________________________
>
> Nicholas Loulloudes
> High Performance Computing Systems Laboratory (HPCL)
> University of Cyprus,
> Nicosia, Cyprus
