hey goutham, we recently (mesos-0.12.0) rewrote the hadoop port on mesos, and the old hadoop port is no longer supported. we will be sending out an email soon with a 0.12.0 release candidate which you can try. in the meantime, you could try building mesos from the 0.12.x branch of our git repo: https://git-wip-us.apache.org/repos/asf?p=incubator-mesos.git.
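
something like this should work (a rough, untested sketch; i'm guessing the clone url from the gitweb link above, and your configure flags may differ for your setup):

  $ git clone https://git-wip-us.apache.org/repos/asf/incubator-mesos.git
  $ cd incubator-mesos
  $ git checkout 0.12.x
  $ ./bootstrap                # regenerates the autotools files
  $ mkdir build && cd build    # out-of-tree build keeps the source tree clean
  $ ../configure
  $ make
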
On Thu, Jun 6, 2013 at 11:44 PM, Goutham Tadi <[email protected]> wrote:

> [root@10 mesos-0.10.0]# ls
> aclocal.m4  config.guess  config.status  configure.ac  hadoop      libtool    m4           Makefile.in  NOTICE              src
> bin         config.log    config.sub     depcomp       include     LICENSE    Makefile     missing      protobuf-2.4.1.jar  support
> bootstrap   config.lt     configure      ec2           install-sh  ltmain.sh  Makefile.am  mpi          README              third_party
>
> [root@10 mesos-0.10.0]# cd hadoop/
> [root@10 hadoop]# ls
> hadoop-0.20.205.0_hadoop-env.sh.patch  hadoop-0.20.2-cdh3u3_hadoop-env.sh.patch  Makefile.am            mesos
> hadoop-0.20.205.0_mesos.patch          hadoop-0.20.2-cdh3u3_mesos.patch          Makefile.in            mesos-executor
> hadoop-0.20.205.0.patch                hadoop-0.20.2-cdh3u3.patch                mapred-site.xml.patch  TUTORIAL.sh
>
> [root@10 hadoop]# ./TUTORIAL.sh 0.20.2-cdh3u3
>
> Welcome to the tutorial on running Apache Hadoop on top of Mesos!
> During this interactive guide we'll ask some yes/no questions and you
> should enter your answer via 'Y' or 'y' for yes and 'N' or 'n' for no.
>
> Let's begin!
>
> We'll try and grab hadoop-0.20.2-cdh3u3 for you now via:
>
>   $ wget http://archive.cloudera.com/cdh/3/hadoop-0.20.2-cdh3u3.tar.gz
>
> Hit enter to continue.
>
> --2013-06-07 12:00:18--  http://archive.cloudera.com/cdh/3/hadoop-0.20.2-cdh3u3.tar.gz
> Connecting to 10.138.90.12:8000... connected.
> Proxy request sent, awaiting response... 200 OK
> Length: 69684945 (66M) [application/x-gzip]
> Saving to: “hadoop-0.20.2-cdh3u3.tar.gz”
>
> 100%[==========================================>] 69,684,945  162K/s   in 8m 32s
>
> 2013-06-07 12:08:51 (133 KB/s) - “hadoop-0.20.2-cdh3u3.tar.gz” saved [69684945/69684945]
>
> Let's start by extracting hadoop-0.20.2-cdh3u3.tar.gz:
>
>   $ tar zxvf hadoop-0.20.2-cdh3u3.tar.gz
>
> Hit enter to continue.
>
> ......
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/image/create-hadoop-image-remote
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/image/ec2-run-user-data
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/launch-hadoop-cluster
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/launch-hadoop-master
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/launch-hadoop-slaves
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/list-hadoop-clusters
> hadoop-0.20.2-cdh3u3/src/contrib/ec2/bin/terminate-hadoop-cluster
> hadoop-0.20.2-cdh3u3/src/examples/pipes/configure
>
> Okay, now let's change into the hadoop-0.20.2-cdh3u3 directory in order
> to apply some patches, copy in the Mesos specific code, and build everything.
>
>   $ cd hadoop-0.20.2-cdh3u3
>
> Hit enter to continue.
>
> To run Hadoop on Mesos we need to apply a rather minor patch. The
> patch makes a small number of modifications in Hadoop. (Note that the
> changes to Hadoop have been committed in revisions r1033804 and
> r987589 so at some point we won't need to apply any patch at all.)
> We'll apply the patch with:
>
>   $ patch -p1 <../hadoop-0.20.2-cdh3u3.patch
>
> Hit enter to continue.
>
> patching file src/mapred/org/apache/hadoop/mapred/JobInProgress.java
> patching file src/mapred/org/apache/hadoop/mapred/Task.java
> patching file src/mapred/org/apache/hadoop/mapred/TaskRunner.java
> patching file src/mapred/org/apache/hadoop/mapred/TaskTracker.java
> Hunk #4 succeeded at 2036 (offset 22 lines).
> Hunk #5 succeeded at 2204 (offset 22 lines).
> Hunk #6 succeeded at 2313 (offset 22 lines).
> Hunk #7 succeeded at 2979 (offset 22 lines).
> Hunk #8 succeeded at 3186 (offset 22 lines).
> Hunk #9 succeeded at 3327 (offset 22 lines).
> Hunk #10 succeeded at 3362 (offset 22 lines).
> Hunk #11 succeeded at 3548 (offset 22 lines).
> patching file src/mapred/org/apache/hadoop/mapred/TaskTrackerInstrumentation.java
>
> Now we'll copy over the Mesos contrib components. In addition, we'll
> need to edit ivy/libraries.properties and src/contrib/build.xml to
> hook the Mesos contrib component into the build. We've included a
> patch to do that for you:
>
>   $ cp -r ../mesos src/contrib
>   $ cp -p ../mesos-executor bin
>   $ patch -p1 <../hadoop-0.20.2-cdh3u3_mesos.patch
>
> Hit enter to continue.
>
> patching file ivy/libraries.properties
> patching file src/contrib/build.xml
>
> Okay, now we're ready to build and then run Hadoop! There are a couple
> important considerations. First, we need to locate the Mesos JAR and
> native library (i.e., libmesos.so on Linux and libmesos.dylib on Mac
> OS X). The Mesos JAR is used for both building and running, while the
> native library is only used for running. In addition, we need to
> locate the Protobuf JAR (if you don't already have one on your
> default classpath).
>
> This tutorial assumes you've built Mesos already. We'll use the
> environment variable MESOS_BUILD_DIR to denote this directory.
>
> Hit enter to continue.
>
> Using /root/mesos-0.10.0 as the build directory.
>
> Now we'll copy over the necessary libraries we need from the build
> directory.
>
>   $ cp /root/mesos-0.10.0/protobuf-2.4.1.jar lib
>   $ cp /root/mesos-0.10.0/src/mesos-0.10.0.jar lib
>   $ mkdir -p lib/native/Linux-amd64-64
>   $ cp /root/mesos-0.10.0/src/.libs/libmesos.so lib/native/Linux-amd64-64
>
> Okay, let's try building Hadoop and the Mesos contrib classes:
>
>   $ ant
>
> Hit enter to continue.
>
> Buildfile: build.xml
>      [exec] [INFO] Scanning for projects...
>      [exec] Downloading:
>        https://repository.cloudera.com/content/groups/cdh-releases-rcs/com/cloudera/cdh/hadoop-root/3.0-u3/hadoop-root-3.0-u3.pom
>      [exec] Downloading:
>        https://repository.cloudera.com/content/repositories/snapshots/com/cloudera/cdh/hadoop-root/3.0-u3/hadoop-root-3.0-u3.pom
>      [exec] Downloading:
>        http://repo.maven.apache.org/maven2/com/cloudera/cdh/hadoop-root/3.0-u3/hadoop-root-3.0-u3.pom
>      [exec] [ERROR] The build could not read 1 project -> [Help 1]
>      [exec] [ERROR]
>      [exec] [ERROR] The project com.cloudera.cdh:hadoop-ant:${cdh.parent.version}
>        (/root/mesos-0.10.0/hadoop/hadoop-0.20.2-cdh3u3/cloudera-pom.xml) has 1 error
>      [exec] [ERROR]   Non-resolvable parent POM: Could not transfer artifact
>        com.cloudera.cdh:hadoop-root:pom:3.0-u3 from/to cdh.releases.repo
>        (https://repository.cloudera.com/content/groups/cdh-releases-rcs):
>        Connection to https://repository.cloudera.com refused and
>        'parent.relativePath' points at wrong local POM @ line 26, column 11:
>        Connection refused -> [Help 2]
>      [exec] [ERROR]
>      [exec] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>      [exec] [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>      [exec] [ERROR]
>      [exec] [ERROR] For more information about the errors and possible solutions,
>        please read the following articles:
>      [exec] [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
>      [exec] [ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/UnresolvableModelException
>
> BUILD FAILED
> /root/mesos-0.10.0/hadoop/hadoop-0.20.2-cdh3u3/build.xml:42: exec returned: 1
>
> Total time: 1 second
>
> Oh no! We failed to run 'ant'.
> If you need help try emailing:
>
>   [email protected]
>
> (Remember to include as much debug information as possible.)
>
> Thanks
> Goutam Tadi
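
btw, re the 'ant' failure in the log above: wget went through your proxy
(10.138.90.12:8000) fine, but the maven run that ant spawns gets 'Connection
refused' talking to repository.cloudera.com, which usually means maven isn't
configured to use the proxy. if you still want to experiment with the old
0.10.0 tutorial while waiting for 0.12.0, pointing maven at the same proxy
might get you past that step. a rough, untested sketch (it assumes you don't
already have a ~/.m2/settings.xml it would overwrite, and it reuses the
host/port from your wget output):

$ mkdir -p ~/.m2
$ cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <proxies>
    <proxy>
      <id>corp-proxy</id>        <!-- arbitrary id -->
      <active>true</active>
      <protocol>http</protocol>
      <host>10.138.90.12</host>  <!-- the proxy wget used -->
      <port>8000</port>
    </proxy>
  </proxies>
</settings>
EOF

that said, the rewritten port in 0.12.0 is the supported path going forward,
so i'd spend time there first.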
