Kay Kay wrote:
Start with hadoop-common when building.
hadoop-hdfs and hadoop-mapred pull their dependencies from the Apache
snapshot repository, which contains nightlies of the last successful
builds, so in theory all three projects can be built independently
because the respective snapshots are already present there.
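For example, building each project on its own usually looks something
like this (targets and directory names are from my setup and are only a
sketch - check each project's build.xml for the exact targets):

  cd hadoop-common && ant clean jar
  cd ../hadoop-hdfs && ant clean jar    # resolves hadoop-common -SNAPSHOT from the Apache snapshot repo
  cd ../hadoop-mapred && ant clean jar  # resolves both -common and -hdfs snapshots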
If you do want to make cross-project changes and test them (a rough
sketch of these steps as commands follows the list):
* create a local ivy resolver,
* place it before the apache snapshot resolver in the ivy settings,
* publish the jar for a given project to the directory pointed to by
the local resolver from step 1,
* clear the ivy cache, and
* recompile.
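Roughly, assuming a filesystem resolver pointing at ~/local-ivy-repo
(the repo path, jar names and layout below are illustrative, not exact;
the cache location is ivy's default ~/.ivy2/cache):

  # steps 1+2: add a filesystem resolver for ~/local-ivy-repo ahead of
  #            the apache snapshot resolver in the ivysettings.xml the build reads
  # step 3:    build the changed project and publish its jar into that directory
  cd hadoop-common && ant clean jar
  cp build/hadoop-*.jar ~/local-ivy-repo/   # layout must match your resolver's pattern
  # step 4:    clear the ivy cache so the stale snapshot is not picked up
  rm -rf ~/.ivy2/cache/org.apache.hadoop
  # step 5:    rebuild the downstream project
  cd ../hadoop-hdfs && ant clean jar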
I think you just need to tweak build.properties to use the internal
resolver.
This is the build.properties I have symlinked into -common, -hdfs and -mapred:
#This is symlinked across projects!
patch.version=1
resolvers=internal
#you can increment this number as you see fit
version=0.22.0-alpha-1
hadoop.version=${version}
hadoop-core.version=${version}
hadoop-hdfs.version=${version}
hadoop-mapred.version=${version}
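If it helps, the symlink setup itself is just one canonical copy pointed
at from each checkout, along these lines (paths are only illustrative):

  ln -s ~/hadoop/build.properties hadoop-common/build.properties
  ln -s ~/hadoop/build.properties hadoop-hdfs/build.properties
  ln -s ~/hadoop/build.properties hadoop-mapred/build.properties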