As a follow-up to the discussion on IRC, here is a thread to address
some frustrations / pain points with Ivy.
More often than not, when patching hadoop / hdfs and testing the patches
with hbase, Ivy gets in the way, and clearing the Ivy cache to start all
over is a real drain on productivity.
To work around this, the following strategy should work well:
* Check out hadoop / hdfs from svn as appropriate.
* Change the version of the hadoop / hdfs build to which the patches will
be applied to something that does not conflict with the central
repositories (say, 0.20.8, 0.21.8, etc.) - see the example below.
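For example, in a stock hadoop / hdfs build.xml the artifact revision is
driven by a "version" property (verify the property name in the build file
you are actually patching); bumping it there, or overriding it with
-Dversion on every ant invocation, is enough:

  <!-- build.xml: use a revision that cannot collide with artifacts
       already present in the central repositories -->
  <property name="version" value="0.20.8"/>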
hadoop/hdfs:
--------------
Add the following target to the build.xml of hadoop / hdfs, where
"local" is a local file system resolver set up in ivysettings.xml (a
sketch of such a resolver follows the target).
<target name="local.publish" depends="jar">
  <ivy:publish resolver="local" forcedeliver="true" overwrite="true">
    <artifacts pattern="./build/[module]-[revision].[ext]"/>
  </ivy:publish>
</target>
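For reference, here is a minimal sketch of what the "local" resolver could
look like in ivysettings.xml; the repository path and the maven2 fallback
are illustrative, so adapt them to the project's existing settings:

<ivysettings>
  <property name="local.repo.dir" value="${user.home}/.ivy-local"/>
  <settings defaultResolver="default"/>
  <resolvers>
    <chain name="default">
      <!-- file system resolver that the local.publish target publishes to -->
      <filesystem name="local">
        <ivy pattern="${local.repo.dir}/[organisation]/[module]/ivy-[revision].xml"/>
        <artifact pattern="${local.repo.dir}/[organisation]/[module]/[artifact]-[revision].[ext]"/>
      </filesystem>
      <!-- everything not published locally still comes from maven central -->
      <ibiblio name="maven2" m2compatible="true"/>
    </chain>
  </resolvers>
</ivysettings>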
To make this work with hbase, first publish the patched artifacts to the
local resolver:
$ ant local.publish
hbase:
------
* Use the same file system resolver that was used for publishing in
hadoop / hdfs.
* Change libraries.properties to 0.20.8 / 0.21.8 as appropriate (see the
snippet below).
* ant clean package
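The libraries.properties change is a one-liner along these lines (the
property key shown is illustrative; use whichever key the file actually
defines for the hadoop / hdfs version):

  # ivy/libraries.properties
  hadoop.version=0.20.8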
So now hbase will be running against the patched hdfs / hadoop-common,
as we want it to.
Hope this helps.