Frank,

Firing up a MiniDFSCluster instance (with the hadoop-test and
hadoop-core jars on your classpath) lets you run a single-JVM,
in-process HDFS service to test against.
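
For example, something along these lines should work (a minimal
sketch; it assumes the 0.20.x-era MiniDFSCluster constructor from
hadoop-test -- newer releases expose a MiniDFSCluster.Builder
instead -- and the file path below is just illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class MiniDfsSmokeTest {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Start an in-process HDFS: one datanode, format the storage dirs.
    MiniDFSCluster cluster = new MiniDFSCluster(conf, 1, true, null);
    try {
      FileSystem fs = cluster.getFileSystem();
      Path p = new Path("/tmp/sample.txt");   // illustrative test file
      fs.create(p).close();
      System.out.println("exists? " + fs.exists(p));
    } finally {
      cluster.shutdown();                     // tear down namenode + datanode
    }
  }
}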

Also, do note that Hadoop works fine on the local file system itself,
accessed through the same FileSystem interface. So where possible,
it's best to avoid running services that'd slow your tests down and
use the local FileSystem instead.
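
That can look like this (again just a sketch; FileSystem.getLocal
returns a LocalFileSystem backed by the local disk, and the directory
name here is only an example):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LocalFsSmokeTest {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Same FileSystem API, but backed by local disk -- no daemons to start.
    FileSystem fs = FileSystem.getLocal(conf);
    Path dir = new Path("build/test-data");   // example test directory
    fs.mkdirs(dir);
    Path file = new Path(dir, "sample.txt");
    fs.create(file).close();
    System.out.println("exists? " + fs.exists(file));
    fs.delete(dir, true);                     // clean up after the test
  }
}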

On Sat, Aug 27, 2011 at 12:00 AM, Frank Astier <[email protected]> wrote:
> Hi -
>
> Is there a way I can start HDFS (the namenode) from a Java main and run unit 
> tests against that? I need to integrate my Java/HDFS program into unit tests, 
> and the unit test machine might not have Hadoop installed. I’m currently 
> running the unit tests by hand with hadoop jar ... My unit tests create a 
> bunch of (small) files in HDFS and manipulate them. I use the fs API for 
> that. I don’t have map/reduce jobs (yet!).
>
> Thanks!
>
> Frank
>



-- 
Harsh J
