Hello Arun,

On Sun, Sep 18, 2011 at 11:59 AM, ArunKumar <[email protected]> wrote:
> Hi !
>
> I have set up hadoop on my machine as per
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
> I am able to run application with capacity scheduler by submit jobs to a
> paricular queue from owner of hadoop "hduser".

You don't need to run the Hadoop daemons as a separate user in order
to submit jobs from other accounts. Run the daemons as 'hduser' and
submit jobs from any other user as normal.

As another user, try running a sample job this way:

hduser $ start-all.sh
hduser $ su - other
other $ /home/hduser/hadoop203/bin/hadoop jar \
    /home/hduser/hadoop203/hadoop-examples*.jar pi 1 1

What error do you get if the above fails? (Ensure your 'other' user
has the permissions needed to read and execute /home/hduser/hadoop203.)
If it fails as user 'other', can you also paste your MapReduce and
capacity scheduler configurations?
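For reference, queue setup on 0.20.x usually spans two files. A minimal sketch (the queue name 'myqueue' below is just an example - substitute whatever queue you configured):

```xml
<!-- mapred-site.xml: enable the scheduler and declare the queues -->
<property>
  <name>mapred.jobtracker.taskScheduler</name>
  <value>org.apache.hadoop.mapred.CapacityTaskScheduler</value>
</property>
<property>
  <name>mapred.queue.names</name>
  <value>default,myqueue</value>
</property>

<!-- capacity-scheduler.xml: give the queue a share of capacity -->
<property>
  <name>mapred.capacity-scheduler.queue.myqueue.capacity</name>
  <value>50</value>
</property>
```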

> I tried this from other user :
> 1. Configured ssh
> 2. Changed the hadoop exract's permission to 777.
> 3. Updated $HOME/.bashrc as per above link
> 4. Changed hadoop.tmp.dir permission to 777.
> 5. $bin/start-all.sh gives

I'm unsure why you want to start Hadoop as another user. I don't think
that's your goal - if I understand right, you're looking to submit
jobs as another user, and for that you don't need to start the daemons
as that user at all.
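So with the daemons already running as 'hduser', submitting to a particular queue from 'other' should just be a matter of passing the queue name at submit time. A sketch, assuming a Tool-based example job and a hypothetical queue 'myqueue' (on 0.20.x the property is mapred.job.queue.name):

```shell
# Run as 'other'; the generic -D option selects the target queue
other $ /home/hduser/hadoop203/bin/hadoop jar \
    /home/hduser/hadoop203/hadoop-examples*.jar pi \
    -Dmapred.job.queue.name=myqueue 1 1
```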


-- 
Harsh J
