For #1, you can get the tarball from http://www.apache.org/dyn/closer.cgi/hadoop/common/, e.g. http://www.motorlogy.com/apache/hadoop/common/hadoop-2.1.0-beta/
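A minimal sketch of fetching and unpacking it from a mirror (the mirror URL is the example one above; the exact archive file name is an assumption based on the usual release layout, so double-check it on the mirror's directory listing):

```shell
# Assumed mirror and release name -- verify on the closer.cgi page first.
MIRROR=http://www.motorlogy.com/apache/hadoop/common
VERSION=hadoop-2.1.0-beta

# Download and unpack the release tarball.
wget "$MIRROR/$VERSION/$VERSION.tar.gz"
tar -xzf "$VERSION.tar.gz"
cd "$VERSION"
```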
It is in maven too: http://mvnrepository.com/artifact/org.apache.hadoop/

For #2, see https://code.google.com/p/protobuf/

On Sat, Sep 14, 2013 at 10:54 AM, Mahmoud Al-Ewiwi <[email protected]> wrote:

> Hello,
>
> I'm new to Hadoop and I want to learn it in order to do a project.
> I've started reading the documentation at this site:
>
> http://hadoop.apache.org/docs/r2.1.0-beta/hadoop-project-dist/hadoop-common/SingleCluster.html
>
> for setting up a single node, but I could not figure out a lot of things
> in this documentation.
>
> 1. "You should be able to obtain the MapReduce tarball from the release"
>
> I could not find this tarball; where is it?
>
> 2. "You will need protoc 2.5.0 installed"
>
> What is that? There is not even a link saying what it is.
>
> 3. "Assuming you have installed hadoop-common/hadoop-hdfs"
>
> What are these as well, and why are you assuming that? I have just
> downloaded hadoop-2.1.0-beta
> <http://ftp.itu.edu.tr/Mirror/Apache/hadoop/common/hadoop-2.1.0-beta/>
> and extracted it.
>
> 4. and exported *$HADOOP_COMMON_HOME*/*$HADOOP_HDFS_HOME*
>
> That is strange; where should these environment variables point?
>
> Lastly, as far as I know, a first-steps tutorial should give more
> details. Or am I looking in the wrong place?
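For checking #2 once protobuf is installed, something like this should confirm the right version is on the PATH (the expected version string is an assumption from the docs' requirement of protoc 2.5.0):

```shell
# Check whether protoc (from Google's Protocol Buffers project) is
# installed and report its version; the Hadoop build expects 2.5.0.
if command -v protoc >/dev/null 2>&1; then
    protoc --version   # expect something like "libprotoc 2.5.0"
else
    echo "protoc not found: install protobuf 2.5.0 first"
fi
```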