[ https://issues.apache.org/jira/browse/HADOOP-6953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12926418#action_12926418 ]
Gerald Guo commented on HADOOP-6953:
------------------------------------
You can add the following code to the files "hdfs-config.sh" and "mapred-config.sh":
this="${BASH_SOURCE-$0}"
while [ -h "$this" ]; do
ls=`ls -ld "$this"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '.*/.*' > /dev/null; then
this="$link"
else
this=`dirname "$this"`/"$link"
fi
done
# convert relative path to absolute path
common_bin=`dirname "$this"`
script=`basename "$this"`
common_bin=`cd "$common_bin"; pwd`
this="$common_bin/$script"
# the root of the Hadoop installation
#TODO: change the env variable when dir structure is changed
export HADOOP_HOME=`dirname "$this"`/..
export HADOOP_COMMON_HOME="${HADOOP_HOME}"
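
For context, a rough sketch of how the snippet would sit at the top of "hdfs-config.sh" (and analogously "mapred-config.sh"), assuming those scripts follow the usual pattern of sourcing hadoop-config.sh from HADOOP_COMMON_HOME and printing "Hadoop common not found." when they cannot; the exact existing contents may differ:

# --- top of hdfs-config.sh (sketch; the guard below is approximate) ---

# ... symlink-resolution snippet from above goes here; it exports
#     HADOOP_HOME and HADOOP_COMMON_HOME based on this script's own location ...

# existing guard (approximate): with the exports in place it now succeeds
# even when the caller never set HADOOP_HOME in the environment
if [ -d "${HADOOP_COMMON_HOME}" ]; then
  . "${HADOOP_COMMON_HOME}/bin/hadoop-config.sh"
else
  echo "Hadoop common not found."
  exit 1
fi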
> start-{dfs,mapred}.sh scripts fail if HADOOP_HOME is not set
> ------------------------------------------------------------
>
> Key: HADOOP-6953
> URL: https://issues.apache.org/jira/browse/HADOOP-6953
> Project: Hadoop Common
> Issue Type: Bug
> Components: scripts
> Affects Versions: 0.21.0
> Reporter: Tom White
> Fix For: 0.21.1
>
>
> If the HADOOP_HOME environment variable is not set then the start and stop
> scripts for HDFS and MapReduce fail with "Hadoop common not found.". The
> start-all.sh and stop-all.sh scripts are not affected.
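
A minimal reproduction of the reported failure, assuming an unpacked 0.21.0 release directory (paths are illustrative):

$ unset HADOOP_HOME HADOOP_COMMON_HOME
$ ./bin/start-dfs.sh
Hadoop common not found.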