Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "GitAndHadoop" page has been changed by SteveLoughran.
The comment on this change is: filled out the build section.
http://wiki.apache.org/hadoop/GitAndHadoop?action=diff&rev1=1&rev2=2

--------------------------------------------------

  
  == Before you begin ==
  
- You need a copy of git on your system. Some IDEs ship with Git support; this 
page assumes you are using the command line.
+  1. You need a copy of git on your system. Some IDEs ship with Git support; 
this page assumes you are using the command line.
+  1. You need a copy of Apache Ant 1.7+ on your system for the builds themselves.
+  1. You need to be online for your first checkout and build.
+  1. You need to set Ant up so that it works with any proxy you have. This is documented by [[http://ant.apache.org/manual/proxy.html|the Ant team]]; a minimal example follows this list.
+ 
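+ If all you need is to get through a straightforward HTTP proxy, Ant picks up the standard JVM proxy properties. A minimal sketch (the host and port here are placeholders, not real values):
+ {{{
+ # placeholder proxy host/port -- substitute your own
+ export ANT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080"
+ }}}
+ On Java 5 or later you can instead try {{{ant -autoproxy}}}, which asks the JVM to pick up the operating system's proxy settings.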
  
  == Checking out the source ==
  
@@ -31, +35 @@

  }}}
  The total download is well over 100MB, so the initial checkout process works 
best when the network is fast. Once downloaded, Git works offline.
  
+ == Building the source ==
+ 
+ You need to tell all the Hadoop modules to pick up local JARs of the bits of Hadoop they depend on. You do this by making sure your Hadoop version does not match anything public, and by telling the build to use the "internal" repository of locally published artifacts.
+ 
+ === Create a build.properties file ===
+ 
+ Create a {{{build.properties}}} file. Do not create it inside the git directories; put it one directory up, as it is going to be shared between the modules. Incidentally, this article assumes you are using Linux or another Unix.
+ 
+ Make the file something like this:
+ {{{
+ #this is essential
+ resolvers=internal
+ #you can increment this number as you see fit
+ version=0.22.0-alpha-1
+ hadoop.version=${version}
+ hadoop-core.version=${version}
+ hadoop-hdfs.version=${version}
+ hadoop-mapred.version=${version}
+ }}}
+ 
+ Next, symlink this file into every Hadoop module, so that a change in the shared file gets picked up by all three. Note that a symlink target is resolved relative to the directory containing the link, hence the {{{../}}} prefix:
+ {{{
+ ln -s ../build.properties hadoop-common/build.properties
+ ln -s ../build.properties hadoop-hdfs/build.properties
+ ln -s ../build.properties hadoop-mapreduce/build.properties
+ }}}
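+ 
+ A quick sanity check (on Linux; {{{readlink -f}}} is GNU-specific) that the links resolve to the shared file:
+ {{{
+ # each command should print the path of the file one level up
+ readlink -f hadoop-common/build.properties
+ readlink -f hadoop-hdfs/build.properties
+ }}}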
+ 
+ You are all set up to build.
+ 
+ === Build Hadoop ===
+ 
+  1. In {{{hadoop-common/}}} run {{{ant mvn-install}}}
+  1. In {{{hadoop-hdfs/}}} run {{{ant mvn-install}}}
+  1. In {{{hadoop-mapreduce/}}} run {{{ant mvn-install}}}
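+ 
+ If you rebuild often, the three steps can be scripted. A minimal sketch, run from the directory holding the shared {{{build.properties}}}:
+ {{{
+ #!/bin/sh
+ # build and locally publish each module, in dependency order;
+ # stop at the first failure
+ for module in hadoop-common hadoop-hdfs hadoop-mapreduce ; do
+   ( cd "$module" && ant mvn-install ) || exit 1
+ done
+ }}}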
+ 
+ This Ant target not only builds the JAR files, it copies them to the local {{{${user.home}/.m2}}} directory, where they will be picked up by the "internal" resolver. You can check that this is taking place by running {{{ant ivy-report}}} on a project and seeing where it gets its dependencies.
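+ 
+ You can also look in the local repository directly. The exact layout below is an assumption; trust what {{{ant ivy-report}}} shows over this sketch:
+ {{{
+ # locally published Hadoop artifacts should appear under the org.apache.hadoop group
+ ls ~/.m2/repository/org/apache/hadoop/
+ }}}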
+ 
+ === Testing ===
+ 
+ Each project comes with lots of tests; run {{{ant test}}} to run them. If you have made changes to the build and tests fail, it may be that the tests never worked on your machine, so build and test the unmodified source first. Then keep an eye on both the main source and any branch you make. A good way to do this is to give a Continuous Integration server such as Hudson the job of checking out, building and testing both branches.
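+ 
+ As a concrete sequence (hypothetical: {{{TestFoo}}} is a placeholder, and the {{{-Dtestcase}}} property is a common convention in Ant builds of this era rather than something this page documents):
+ {{{
+ # establish a baseline on the unmodified source first
+ ant test
+ # after making changes, rerun; many Ant builds let you run one test by name
+ ant test -Dtestcase=TestFoo
+ }}}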
+ 
