[
https://issues.apache.org/jira/browse/HADOOP-6953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13009550#comment-13009550
]
thron_xv commented on HADOOP-6953:
----------------------------------
To work around this issue, set HADOOP_HOME in the environment.
For example, under Cygwin:
export HADOOP_HOME=/cygdrive/c/hadoop/
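The same workaround applies outside Cygwin; the path below is only an example install
location, so adjust it to wherever the Hadoop distribution is actually unpacked:
export HADOOP_HOME=/usr/local/hadoop
start-dfs.sh
start-mapred.sh
Adding the export to the shell profile (e.g. ~/.bashrc) keeps it set for future sessions.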
> start-{dfs,mapred}.sh scripts fail if HADOOP_HOME is not set
> ------------------------------------------------------------
>
> Key: HADOOP-6953
> URL: https://issues.apache.org/jira/browse/HADOOP-6953
> Project: Hadoop Common
> Issue Type: Bug
> Components: scripts
> Affects Versions: 0.21.0
> Reporter: Tom White
> Assignee: Tom White
> Priority: Blocker
> Fix For: 0.21.1, 0.22.0
>
>
> If the HADOOP_HOME environment variable is not set then the start and stop
> scripts for HDFS and MapReduce fail with "Hadoop common not found.". The
> start-all.sh and stop-all.sh scripts are not affected.