Along with what Ted says... Find someone with a packaged demo or continue trying to manage your way through the install.
This is why some vendors are successful packaging Hadoop and offering support services: it's not always an easy task.

On Sat, Sep 14, 2013 at 2:54 PM, Mahmoud Al-Ewiwi <[email protected]> wrote:
> Hello,
> I'm new to Hadoop and I want to learn it in order to do a project.
> I've started reading the documentation at this site:
> http://hadoop.apache.org/docs/r2.1.0-beta/hadoop-project-dist/hadoop-common/SingleCluster.html
> for setting up a single node, but I could not figure out a lot of things in this
> documentation.
> 1. "You should be able to obtain the MapReduce tarball from the release"
> I could not find this tarball; where is it?
> 2. "You will need protoc 2.5.0 installed"
> What is that? There is not even a link for it or an explanation of what it is.
> 3. "Assuming you have installed hadoop-common/hadoop-hdfs"
> What are these, and why are you assuming that? I have just downloaded
> hadoop-2.1.0-beta <http://ftp.itu.edu.tr/Mirror/Apache/hadoop/common/hadoop-2.1.0-beta/> and
> extracted it.
> 4. "and exported *$HADOOP_COMMON_HOME*/*$HADOOP_HDFS_HOME*"
> That is strange; to where should these environment variables point?
> Lastly, as far as I know, a first-steps tutorial should give more details. Or am I
> looking in the wrong place?
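
Regarding question 4: on a single-node install from the binary tarball, both variables usually just point at the directory you extracted, since the 2.x binary release bundles the common, HDFS and MapReduce modules together. A rough sanity-check sketch in Python, assuming the tarball was unpacked to ~/hadoop-2.1.0-beta (a hypothetical path; adjust it to wherever you extracted the release):

import os

# Hypothetical extraction directory for the hadoop-2.1.0-beta binary tarball.
expected = os.path.expanduser("~/hadoop-2.1.0-beta")

for var in ("HADOOP_COMMON_HOME", "HADOOP_HDFS_HOME"):
    value = os.environ.get(var)
    if value is None:
        print(var + " is not set; on a single-node tarball install it can point at " + expected)
    elif not os.path.isdir(os.path.join(value, "bin")):
        print(var + "=" + value + " does not look like an extracted Hadoop directory (no bin/)")
    else:
        print(var + "=" + value + " looks plausible")

If both variables report "looks plausible", the SingleCluster instructions should be able to find the scripts and jars they expect.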
