Hello Mahmoud,

You are not the only one who thinks the Apache Hadoop installation documentation is unclear.
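In the meantime, a quick pointer on your question 2: protoc is the compiler for Google's Protocol Buffers, which the Hadoop 2.x build requires at exactly version 2.5.0. Roughly, checking and building it from source looks like the following (a sketch for a Debian/Ubuntu-style box; the download link is the 2013-era Google Code location, so substitute whatever mirror is current):

    # See whether protoc is already installed; Hadoop 2.1.0-beta wants
    # this to print "libprotoc 2.5.0"
    protoc --version

    # If it is missing or the wrong version, build 2.5.0 from source
    wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
    tar -xzf protobuf-2.5.0.tar.gz
    cd protobuf-2.5.0
    ./configure && make && sudo make install
    sudo ldconfig   # refresh the shared-library cache

Note that you only need protoc if you are compiling Hadoop from source; the pre-built binary tarball you downloaded does not require it.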
You may want to train with a Hadoop trainer instead of going through the painstaking process of reading the Apache Hadoop setup documentation. Check out Lion Data Systems. They have web-based and in-person Apache Hadoop training courses to numb your pain. They also train people on R: liondatasystems.com/courses

- Ray

On Sat, Sep 14, 2013 at 10:54 AM, Mahmoud Al-Ewiwi <[email protected]> wrote:

> Hello,
>
> I'm new to Hadoop and I want to learn it in order to do a project.
> I've started reading the documentation at this site:
>
> http://hadoop.apache.org/docs/r2.1.0-beta/hadoop-project-dist/hadoop-common/SingleCluster.html
>
> for setting up a single node, but I could not figure out a lot of things in
> this documentation.
>
> 1. "You should be able to obtain the MapReduce tarball from the release"
>
> I could not find this tarball. Where is it?
>
> 2. "You will need protoc 2.5.0 installed"
>
> What is that? There is not even a link to it or an explanation of what it is.
>
> 3. "Assuming you have installed hadoop-common/hadoop-hdfs"
>
> What are these as well, and why are you assuming that? I have just downloaded
> hadoop-2.1.0-beta <http://ftp.itu.edu.tr/Mirror/Apache/hadoop/common/hadoop-2.1.0-beta/>
> and extracted it.
>
> 4. "and exported $HADOOP_COMMON_HOME/$HADOOP_HDFS_HOME"
>
> That is strange. Where should these environment variables point?
>
> Lastly, as far as I know, a first-steps tutorial should give more details. Or
> am I searching in the wrong place?
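To take a stab at your questions 3 and 4 as well: hadoop-common and hadoop-hdfs are sub-projects of Hadoop, and the hadoop-2.1.0-beta tarball you downloaded already bundles both, so there is nothing extra to install. The two environment variables are simply expected to point at the directory where you extracted that tarball. A minimal sketch, assuming you extracted it under /opt (adjust the path to wherever yours actually lives):

    # With the combined hadoop-2.1.0-beta tarball, common and HDFS live
    # in the same tree, so both variables can share one value
    export HADOOP_PREFIX=/opt/hadoop-2.1.0-beta
    export HADOOP_COMMON_HOME=$HADOOP_PREFIX
    export HADOOP_HDFS_HOME=$HADOOP_PREFIX

    # Optional: put the Hadoop launcher scripts on your PATH
    export PATH=$PATH:$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin

    # Sanity check: this should print the release version banner
    hadoop version

Put those exports in your shell profile (e.g. ~/.bashrc) so they survive new sessions.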
