These scripts aren't for tarball installs. They are for package
installs, which don't apply to Mac OS X. I haven't a clue what they're
even doing in the release tarball. You should file a JIRA issue to
have them removed.

You just need to follow:
http://hadoop.apache.org/common/docs/current/single_node_setup.html on
your Mac, which details a simple tarball install that should work.

On Wed, Jan 25, 2012 at 5:05 PM, Naveen M <[email protected]> wrote:
> Hi,
>
> I'm trying to set up Hadoop 1.0.0 on my MacBook, and I noticed a bunch of setup
> files in the 'sbin' directory:
>
> -rwxr-xr-x@  1 root  wheel   3392 16 Dec 03:39 hadoop-create-user.sh
> -rwxr-xr-x@  1 root  wheel   3636 16 Dec 03:39 hadoop-setup-applications.sh
> -rwxr-xr-x@  1 root  wheel  26777 16 Dec 03:39 hadoop-setup-conf.sh
> -rwxr-xr-x@  1 root  wheel   4533 16 Dec 03:39 hadoop-setup-hdfs.sh
> -rwxr-xr-x@  1 root  wheel   6738 16 Dec 03:39 hadoop-setup-single-node.sh
> -rwxr-xr-x@  1 root  wheel   4852 16 Dec 03:39 hadoop-validate-setup.sh
> -rwxr-xr-x@  1 root  wheel   4210 16 Dec 03:39 update-hadoop-env.sh
>
> I've attempted to run 'hadoop-setup-single-node.sh' with all default settings
> (answering 'y' to every prompt).
>
> This is what I got:
>
> Proceed with generate configuration? (y/N) y
> chmod: -R: No such file or directory
> chmod: -R: No such file or directory
> chmod: -R: No such file or directory
> mkdir: /home/mr: Operation not supported
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> chown: hadoop: Invalid argument
> Configuration setup is completed.
> Proceed to run hadoop-setup-hdfs.sh on namenode.
> chown: hadoop: Invalid argument
> ./hadoop-setup-single-node.sh: line 169: /etc/init.d/hadoop-namenode: No
> such file or directory
> ./hadoop-setup-single-node.sh: line 179: /etc/init.d/hadoop-namenode: No
> such file or directory
> ./hadoop-setup-single-node.sh: line 180: /etc/init.d/hadoop-datanode: No
> such file or directory
> su: illegal option -- c
> usage: su [-] [-flm] [login [args]]
> su: illegal option -- c
> usage: su [-] [-flm] [login [args]]
> su: illegal option -- c
> usage: su [-] [-flm] [login [args]]
> su: illegal option -- c
> usage: su [-] [-flm] [login [args]]
> ./hadoop-setup-single-node.sh: line 187: /etc/init.d/hadoop-jobtracker: No
> such file or directory
> ./hadoop-setup-single-node.sh: line 188: /etc/init.d/hadoop-tasktracker: No
> such file or directory
>
>
> What's the correct order to run these scripts? If you could point me to any
> documentation, that would be great.
>
> --
> Regards,
>
> Naveen Mukkelli



-- 
Harsh J
Customer Ops. Engineer, Cloudera