Hi Frank,

You can use the ClusterMapReduceTestCase class from org.apache.hadoop.mapred.

Here is an example of adapting it to JUnit 4 and running a test DFS and
MapReduce cluster:

https://github.com/sonalgoyal/hiho/blob/master/test/co/nubetech/hiho/common/HihoTestCase.java

And here is a blog post that discusses this in detail:
http://nubetech.co/testing-hadoop-map-reduce-jobs
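
If you only need HDFS for now (no map/reduce jobs yet), you can also start an
in-process namenode and datanode directly with MiniDFSCluster, which ships in
the Hadoop test jar, so the test machine does not need a Hadoop installation.
Here is a minimal JUnit 4 sketch along those lines; the test class name and
paths are just illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

import static org.junit.Assert.assertTrue;

public class HdfsFileTest {

  private static MiniDFSCluster cluster;
  private static FileSystem fs;

  @BeforeClass
  public static void startCluster() throws Exception {
    Configuration conf = new Configuration();
    // Starts an in-process namenode plus one datanode,
    // formatting a temporary directory on the local disk.
    cluster = new MiniDFSCluster(conf, 1, true, null);
    fs = cluster.getFileSystem();
  }

  @AfterClass
  public static void stopCluster() throws Exception {
    if (fs != null) fs.close();
    if (cluster != null) cluster.shutdown();
  }

  @Test
  public void createsSmallFile() throws Exception {
    // Create an empty file in the test DFS and check it exists,
    // using the same fs API calls a real program would use.
    Path p = new Path("/test/small-file.txt");
    fs.create(p).close();
    assertTrue(fs.exists(p));
  }
}

The FileSystem calls your program makes should work against this cluster
unchanged.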

Best Regards,
Sonal
Crux: Reporting for HBase <https://github.com/sonalgoyal/crux>
Nube Technologies <http://www.nubetech.co>

<http://in.linkedin.com/in/sonalgoyal>

On Sat, Aug 27, 2011 at 12:00 AM, Frank Astier <[email protected]> wrote:

> Hi -
>
> Is there a way I can start HDFS (the namenode) from a Java main and run
> unit tests against that? I need to integrate my Java/HDFS program into unit
> tests, and the unit test machine might not have Hadoop installed. I’m
> currently running the unit tests by hand with hadoop jar ... My unit tests
> create a bunch of (small) files in HDFS and manipulate them. I use the fs
> API for that. I don’t have map/reduce jobs (yet!).
>
> Thanks!
>
> Frank
>
