[
https://issues.apache.org/jira/browse/HDFS-4198?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13498728#comment-13498728
]
Bikas Saha commented on HDFS-4198:
----------------------------------
If your project can run on the trunk version of Hadoop and does not depend
on other Unix tooling, you could try out the code in the hadoop-trunk-win
branch. You can check it out from the hadoop-common git repository and build
it. That branch can build and run Hadoop without any Unix shell scripts or a
Cygwin environment.
> the shell script error for Cygwin on windows7
> ---------------------------------------------
>
> Key: HDFS-4198
> URL: https://issues.apache.org/jira/browse/HDFS-4198
> Project: Hadoop HDFS
> Issue Type: Bug
> Components: scripts
> Affects Versions: 2.0.2-alpha
> Environment: Windows 7, Cygwin
> Reporter: Han Hui Wen
> Fix For: 2.0.3-alpha
>
>
> Run /usr/local/hadoop-2.0.2-alpha/sbin/start-all.sh or
> /usr/local/hadoop-2.0.2-alpha/sbin/start-dfs.sh:
> 1. $HADOOP_PREFIX/bin/hdfs getconf -namenodes
> --------------in hadoop-config
> HADOOP_SLAVE_NAMES:
> --------------in hadoop-config-------------
> ------in hadoop-config.sh before cygpath
> HADOOP_PREFIX:/usr/local/hadoop-2.0.2-alpha
> HADOOP_LOG_DIR:/usr/local/hadoop-2.0.2-alpha/logs
> JAVA_LIBRARY_PATH:
> ------in hadoop-config.sh before cygpath--------------
> cygpath: can't convert empty path
> ------in hadoop-config.sh after cygpath
> HADOOP_PREFIX:/usr/local/hadoop-2.0.2-alpha
> HADOOP_LOG_DIR:/usr/local/hadoop-2.0.2-alpha/logs
> JAVA_LIBRARY_PATH:
> ------in hadoop-config.sh after cygpath--------------
> JAVA_LIBRARY_PATH:/usr/local/hadoop-2.0.2-alpha/lib/native
> ------start to run it in hdfs
> localhost
> henry@IBM-RR0A746AMG4 ~
> $ $HADOOP_PREFIX/bin/hdfs getconf -namenodes
> --------------in hadoop-config
> HADOOP_SLAVE_NAMES:
> --------------in hadoop-config-------------
> cygpath: can't convert empty path
> JAVA_LIBRARY_PATH:/usr/local/hadoop-2.0.2-alpha/lib/native
> ------start to run it in hdfs
> localhost
> ---> If we add logging in hadoop-config.sh, then NAMENODES=$($HADOOP_PREFIX/bin/hdfs
> getconf -namenodes) in start-dfs.sh returns strange values like those above.
> The return value is not reliable, so it should instead be stored in a file or
> passed through command-line parameters.
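The "store it in a file" suggestion above could be sketched roughly as follows. This is a hedged illustration, not Hadoop's actual code; the variable names and the `printf` stand-in for the real `hdfs getconf -namenodes` call are hypothetical. It also strips carriage returns, which would explain the garbled "]tarting namenodes on [localhost" output seen later in the log.

```shell
# Sketch: capture the namenode list via a temp file rather than bare command
# substitution, so stray debug output and CRs don't corrupt the value.
NAMENODES_FILE=$(mktemp)
# In start-dfs.sh this would be:
#   "$HADOOP_PREFIX/bin/hdfs" getconf -namenodes > "$NAMENODES_FILE"
printf 'localhost\r\n' > "$NAMENODES_FILE"   # stand-in output with a stray CR
NAMENODES=$(tr -d '\r' < "$NAMENODES_FILE")  # strip CRs that garble echoes
rm -f "$NAMENODES_FILE"
echo "Starting namenodes on [$NAMENODES]"
```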
> 2. Some directories should not be translated to Windows paths. Here are some related errors:
> $ ./start-dfs.sh
> HADOOP_LIBEXEC_DIR:/usr/local/hadoop-2.0.2-alpha/sbin/../libexec
> after hdfs-config.sh
> which: no hdfs in (./C:\cygwin\usr\local\hadoop-2.0.2-alpha/bin)
> dirname: missing operand
> Try `dirname --help' for more information.
> which: no hdfs in (./C:\cygwin\usr\local\hadoop-2.0.2-alpha/bin)
> NAMENODES:localhost
> ]tarting namenodes on [localhost
> ---------------------will run /usr/local/hadoop-2.0.2-alpha/sbin/slaves.sh
> HADOOP_CONF_DIR:/usr/local/hadoop-2.0.2-alpha/etc/hadoop
> HADOOP_PREFIX:C:\cygwin\usr\local\hadoop-2.0.2-alpha
> NAMENODES:
> para:--script /usr/local/hadoop-2.0.2-alpha/sbin/hdfs start namenode
> ---------------------will run
> /usr/local/hadoop-2.0.2-alpha/sbin/slaves.sh-----------------
> --------------------in slaves.sh
> HADOOP_SLAVE_NAMES:localhost
> --------------------in slaves.sh-------------
> SLAVE_NAMES:localhost
> The slave:localhost
> HADOOP_SSH_OPTS:
> : hostname nor servname provided, or not known
> ---------------------will run /usr/local/hadoop-2.0.2-alpha/sbin/slaves.sh
> HADOOP_CONF_DIR:/usr/local/hadoop-2.0.2-alpha/etc/hadoop
> HADOOP_PREFIX:C:\cygwin\usr\local\hadoop-2.0.2-alpha
> NAMENODES:
> para:--script /usr/local/hadoop-2.0.2-alpha/sbin/hdfs start datanode
> ---------------------will run
> /usr/local/hadoop-2.0.2-alpha/sbin/slaves.sh-----------------
> --------------------in slaves.sh
> HADOOP_SLAVE_NAMES:
> --------------------in slaves.sh-------------
> SLAVE_FILE:/usr/local/hadoop-2.0.2-alpha/etc/hadoop/slaves
> SLAVE_NAMES:localhost
> The slave:localhost
> HADOOP_SSH_OPTS:
> localhost: bash: line 0: cd: C:cygwinusrlocalhadoop-2.0.2-alpha: No such file
> or directory
> localhost: ------------in hadoop-daemon.sh
> localhost: hadoopScript:/usr/local/hadoop-2.0.2-alpha/sbin/hdfs
> localhost: ------------in hadoop-daemon.sh--------------
> localhost: datanode running as process 6432. Stop it first.
> ]tarting secondary namenodes [0.0.0.0
> ---------------------will run /usr/local/hadoop-2.0.2-alpha/sbin/slaves.sh
> HADOOP_CONF_DIR:/usr/local/hadoop-2.0.2-alpha/etc/hadoop
> HADOOP_PREFIX:C:\cygwin\usr\local\hadoop-2.0.2-alpha
> NAMENODES:
> para:--script /usr/local/hadoop-2.0.2-alpha/sbin/hdfs start secondarynamenode
> ---------------------will run
> /usr/local/hadoop-2.0.2-alpha/sbin/slaves.sh-----------------
> --------------------in slaves.sh
> HADOOP_SLAVE_NAMES:0.0.0.0
> --------------------in slaves.sh-------------
> SLAVE_NAMES:0.0.0.0
> The slave:0.0.0.0
> HADOOP_SSH_OPTS:
> : hostname nor servname provided, or not known
> which: no hdfs in (./C:\cygwin\usr\local\hadoop-2.0.2-alpha/bin)
> dirname: missing operand
> Try `dirname --help' for more information.
> which: no hdfs in (./C:\cygwin\usr\local\hadoop-2.0.2-alpha/bin)
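One plausible reproduction of the "cd: C:cygwinusrlocalhadoop-2.0.2-alpha" error above (an assumption, not confirmed by the report): when the translated Windows path is re-expanded unquoted, e.g. through ssh or eval, the shell treats each backslash as an escape character and removes it.

```shell
# Sketch: an unquoted re-expansion of a Windows path eats its backslashes.
dir='C:\cygwin\usr\local\hadoop-2.0.2-alpha'
eval "mangled=$dir"   # backslashes are consumed as quoting characters
quoted=$dir           # a plain quoted assignment keeps them intact
echo "$mangled"       # -> C:cygwinusrlocalhadoop-2.0.2-alpha
echo "$quoted"
```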
> --------------> Solution:
> Comment out the path translation in hadoop-config.sh:
> #if $cygwin; then
> #  HADOOP_PREFIX=`cygpath -w "$HADOOP_PREFIX"`
> #  HADOOP_LOG_DIR=`cygpath -w "$HADOOP_LOG_DIR"`
> #  JAVA_LIBRARY_PATH=`cygpath -w "$JAVA_LIBRARY_PATH"`
> #fi
> The best approach is to translate paths only immediately before the Java command runs.
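That last suggestion could be sketched as below. This is a hypothetical helper, not code from hadoop-config.sh: keep paths in Unix form throughout the scripts, and convert with cygpath only when the value is non-empty and only right before it is handed to the java command line, which also avoids the "cygpath: can't convert empty path" error seen above.

```shell
# Sketch: convert a path to Windows form just before invoking java.
to_win_path() {
  # Guard against "cygpath: can't convert empty path", and pass the value
  # through unchanged when cygpath is unavailable (non-Cygwin systems).
  if [ -n "$1" ] && command -v cygpath >/dev/null 2>&1; then
    cygpath -w "$1"
  else
    printf '%s\n' "$1"
  fi
}

JAVA_LIBRARY_PATH="/usr/local/hadoop-2.0.2-alpha/lib/native"
JAVA_LIBRARY_PATH=$(to_win_path "$JAVA_LIBRARY_PATH")
```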
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira