On Wed, Aug 12, 2009 at 12:05 PM, roman kolcun<[email protected]> wrote:
> Hello everyone,
> I would like to ask what the easiest way is to build and deploy Hadoop. I am
> trying to add some new features to Hadoop, but every time I make a
> small change (while debugging) I have to execute:
>
> ant clean compile jar
>
> then upload the build/hadoop-0.20.1-dev-core.jar file to the cluster and
> distribute it to every node. This already takes approx 2 minutes.
> In addition, Hadoop does not pick up the new core file, so I have to
> restart the whole cluster. And restarting the whole cluster takes several
> minutes.
> So it is very frustrating when, after 5+ minutes, I get some stupid
> NullPointerException, fix it, and then wait another 5+ minutes to see
> the next exception.
> Is there any other way how to speed up the process?
>
> Thank you for every comment.
>
> Roman
>
> PS: I am using Hadoop 0.20.0
>

Roman,

You should really take advantage of the TestCase infrastructure. While
developing TestCases can be time-consuming at first, once you get into
a good flow you realize you cannot live without it.

For example, I contributed the web interface to Hadoop Hive. Initially
I tried to get by without a strong test case: I had to spin up
the web interface and fill out several web forms just to exercise my
changes. I came to realize that doing the TestCase work up front saves
time in the long run.

Hive/Hadoop is slightly different, but the same principles apply. I can run:

ant -Dhadoop.version='0.18.3' -Dtestcase=TestHWISessionManager test

http://svn.apache.org/viewvc/hadoop/hive/trunk/hwi/src/test/org/apache/hadoop/hive/hwi/TestHWISessionManager.java?revision=758836

With this test case I can make sure the underlying features work before deploying.
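To illustrate the idea, here is a minimal sketch of testing your core logic
locally before any cluster deploy. The class and method names below are
hypothetical, not from Hadoop or Hive; with JUnit 3 you would extend
junit.framework.TestCase and use its assertEquals, but plain assertions keep
this example self-contained:

```java
import java.util.Arrays;

// Hypothetical example: pull the logic your mapper delegates to into a
// plain method, then verify it locally in seconds instead of redeploying
// a jar to every node and restarting the cluster.
public class TestLineParser {

    // Hypothetical unit under test: normalize and split an input line.
    static String[] tokenize(String line) {
        return line.trim().toLowerCase().split("\\s+");
    }

    public static void main(String[] args) {
        String[] words = tokenize("  Hello Hadoop  ");
        if (words.length != 2
                || !words[0].equals("hello")
                || !words[1].equals("hadoop")) {
            throw new AssertionError(
                "tokenize failed: got " + Arrays.toString(words));
        }
        System.out.println("ok");
    }
}
```

In a real TestCase the same assertions would run under ant with
-Dtestcase=&lt;your test class&gt;, so a NullPointerException surfaces in
seconds rather than after a 5-minute cluster restart.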
