If you run the Hadoop processes under the 'hadoop' account and have set the
Hadoop data directory to a particular directory, you need to make sure that
the 'hadoop' account can write to that directory.
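
For example, assuming the data directory is the one pointed to by
dfs.data.dir (or hadoop.tmp.dir) in hadoop-site.xml, and taking
/var/hadoop/data as a made-up path for illustration, something like this
should work:

    # check current ownership and permissions of the data directory
    ls -ld /var/hadoop/data

    # hand it over to the 'hadoop' account, recursively
    chown -R hadoop:hadoop /var/hadoop/data

Then restart the daemons and watch the datanode log for any remaining
permission errors.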

On Jan 2, 2008 2:06 PM, Natarajan, Senthil <[EMAIL PROTECTED]> wrote:

> I just uncommented and changed JAVA_HOME; that's all I did in
> hadoop-env.sh.
> Do I need to configure anything else?
>
> Here is the hadoop-env.sh
>
>
> # Set Hadoop-specific environment variables here.
>
> # The only required environment variable is JAVA_HOME.  All others are
> # optional.  When running a distributed configuration it is best to
> # set JAVA_HOME in this file, so that it is correctly defined on
> # remote nodes.
>
> # The java implementation to use.  Required.
>  export JAVA_HOME=/usr/local/GridComputing/software/jdk1.5.0_04
>
> # Extra Java CLASSPATH elements.  Optional.
> # export HADOOP_CLASSPATH=
>
> # The maximum amount of heap to use, in MB. Default is 1000.
> # export HADOOP_HEAPSIZE=2000
>
> # Extra Java runtime options.  Empty by default.
> # export HADOOP_OPTS=-server
>
> # Extra ssh options.  Empty by default.
> # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
>
> # Where log files are stored.  $HADOOP_HOME/logs by default.
> # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
>
> # File naming remote slave hosts.  $HADOOP_HOME/conf/slaves by default.
> # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
>
> # host:path where hadoop code should be rsync'd from.  Unset by default.
> # export HADOOP_MASTER=master:/home/$USER/src/hadoop
>
> # Seconds to sleep between slave commands.  Unset by default.  This
> # can be useful in large clusters, where, e.g., slave rsyncs can
> # otherwise arrive faster than the master can service them.
> # export HADOOP_SLAVE_SLEEP=0.1
>
> # The directory where pid files are stored. /tmp by default.
> # export HADOOP_PID_DIR=/var/hadoop/pids
>
> # A string representing this instance of hadoop. $USER by default.
> # export HADOOP_IDENT_STRING=$USER
>
> # The scheduling priority for daemon processes.  See 'man nice'.
> # export HADOOP_NICENESS=10
>
> -----Original Message-----
> From: Ted Dunning [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, January 02, 2008 5:02 PM
> To: hadoop-user@lucene.apache.org
> Subject: Re: Datanode Problem
>
>
> Well, you have something very strange going on in your scripts.  Have you
> looked at hadoop-env.sh?
>
>
> On 1/2/08 1:58 PM, "Natarajan, Senthil" <[EMAIL PROTECTED]> wrote:
>
> >> /bin/bash: /root/.bashrc: Permission denied
> >> localhost: ssh: localhost: Name or service not known
> >> /bin/bash: /root/.bashrc: Permission denied
> >> localhost: ssh: localhost: Name or service not known
>
>


-- 
tp