Ok, this turned out to be quite simple. I'm writing it down in case somebody faces the same problem. The "mapreduce" repo actually contains builds of "common" and "hdfs" inside its lib/ folder. The only thing missing is the "scripts" folder (i.e. $HADOOP_HOME/bin), which contains things like start-all.sh and stop-all.sh. This bin/ folder can be copied over from the "common" repo, and once you do that everything works fine.
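Roughly, the workaround looks like this (a sketch only; the paths are illustrative, and a sandbox created with mktemp stands in for real svn checkouts of the split repos):

```shell
# Sandbox standing in for checkouts of the three split repos.
# In a real checkout you would first run "ant jar" in each repo.
SRC=$(mktemp -d)
mkdir -p "$SRC/common/bin" "$SRC/mapreduce/lib"
printf '#!/bin/sh\n' > "$SRC/common/bin/start-all.sh"

# The mapreduce repo already carries the common and hdfs builds
# under lib/; only the control scripts are missing. So copy the
# bin/ folder over from the common repo:
cp -r "$SRC/common/bin" "$SRC/mapreduce/bin"

# start-all.sh (and friends) are now in place under mapreduce/bin
ls "$SRC/mapreduce/bin"
```

With the scripts in place, the mapreduce checkout can be treated as $HADOOP_HOME for deployment purposes.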
Cheers,
Harish

On Fri, Jul 10, 2009 at 10:41 AM, Harish Mallipeddi <[email protected]> wrote:
> Any ideas, people?
> I found this page, which includes instructions for core committers on how to
> make a release from SVN (but this looks outdated too):
>
> http://wiki.apache.org/hadoop/HowToRelease
>
> Thanks,
> Harish
>
> On Thu, Jul 9, 2009 at 5:59 PM, Harish Mallipeddi <[email protected]> wrote:
>> "ant jar" builds a jar. But the project has been split into 3 separate
>> entities. There has to be a script which combines the builds from the 3
>> sub-projects and produces one neat hadoop tarball, similar to the
>> hadoop-0.20.0 release tarball, which can be deployed?
>>
>> - Harish
>>
>> On Thu, Jul 9, 2009 at 5:40 PM, Mafish Liu <[email protected]> wrote:
>>> Use "ant jar" if you want the jar file.
>>>
>>> 2009/7/9 Harish Mallipeddi <[email protected]>:
>>> > Hi,
>>> > Are there any instructions on how to build Hadoop from source? Now that
>>> > the project seems to have been split into separate projects (common,
>>> > hdfs, and mapreduce), there are 3 separate repositories under svn. The
>>> > information on this page is no longer correct:
>>> > http://hadoop.apache.org/core/version_control.html
>>> >
>>> > I checked out all three repos and tried building them with "ant". Even
>>> > though the builds completed without errors, I'm not sure how to get a
>>> > single hadoop tarball release (similar to hadoop-0.20.0.tar.gz from the
>>> > website) from there. Also, is the latest trunk of mapreduce supposed to
>>> > work with the latest trunk of common and hdfs, or will it only work
>>> > with specific versions of common and hdfs?
>>> >
>>> > Cheers,
>>> >
>>> > --
>>> > Harish Mallipeddi
>>> > http://blog.poundbang.in
>>>
>>> --
>>> [email protected]
>>> Institute of Computing Technology, Chinese Academy of Sciences, Beijing.
>>> >> >> >> >> -- >> Harish Mallipeddi >> http://blog.poundbang.in >> > > > > -- > Harish Mallipeddi > http://blog.poundbang.in > -- Harish Mallipeddi http://blog.poundbang.in
