On May 10, 2011 17:54:27 Keith Thompson wrote:
> Thanks for catching that comma. It was actually my HADOOP_CONF_DIR rather
> than HADOOP_HOME that was the culprit. :)
> As for sudo ... I am not sure how to run it as a regular user. I set up
> ssh for a passwordless login (and am able to ssh localhost without a
> password), but I installed hadoop to /usr/local, so every time I try to
> run it, it says permission denied. So I have to run hadoop using sudo
> (and it prompts for the superuser password). I should have installed
> hadoop to my home directory instead, I guess ... :/
I'd say that for running tests with a pseudo-distributed cluster on a single machine, the easiest thing is to extract the archive somewhere in your home directory as your regular user. If you extracted the archive as root into /usr/local, then your regular user is probably missing read and execute permissions on various files and directories. Also, with the default configuration, the user running Hadoop needs write permission on the logs directory.

It seems reasonable to defer those concerns until you run on more than one machine.

--
Luca Pireddu
CRS4 - Distributed Computing Group
Loc. Pixina Manna Edificio 1
Pula 09010 (CA), Italy
Tel: +39 0709250452
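
P.S. Here is a rough sketch of what I mean. The archive name/version and paths are assumptions -- substitute whatever you actually downloaded:

```shell
# Extract the Hadoop archive in your home directory as your regular user,
# so no sudo is needed anywhere (archive name is an assumption).
cd "$HOME"
tar xzf hadoop-0.20.203.0.tar.gz

# Point the environment at this unprivileged install.
export HADOOP_HOME="$HOME/hadoop-0.20.203.0"
export HADOOP_CONF_DIR="$HADOOP_HOME/conf"

# The user running the daemons must be able to write the logs directory;
# since you own everything under $HOME, this just works.
mkdir -p "$HADOOP_HOME/logs"

# Start the pseudo-distributed cluster as yourself -- no sudo, no
# permission-denied errors from a root-owned /usr/local tree.
"$HADOOP_HOME/bin/start-all.sh"
```

If you really want to keep the install under /usr/local, the alternative is to chown the whole tree to your user, but for single-machine testing the home-directory route is simpler.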