On Aug 18, 2009, at 2:40 PM, Poole, Samuel [USA] wrote:
I am new to Hadoop (I have not yet installed/configured it), and I want to make sure that I have the correct tool for the job. I do not "currently" have a need for the Map/Reduce functionality, but I am interested in using Hadoop for task orchestration, task monitoring, etc. across numerous nodes in a computing cluster. Our primary programs (written in C++ and launched via shell scripts) each run independently on a single node, but are deployed to different nodes for load balancing. I want to initiate these processes on different nodes from a Java program located on a central server, and I was hoping to use Hadoop as a foundation for this.
Just create a job with 0 reduces. The map tasks will run independently across the cluster. Take a look at RandomWriter, which just writes a set of random data files.
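A minimal sketch of what such a map-only job could look like (untested; the class names, the /path/to/program binary, and the idea of feeding work items in as input lines are placeholders for your own setup, not anything Hadoop prescribes):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TaskLauncher {

  // Hypothetical mapper: each map task shells out to your existing
  // C++ program, passing the input line as its work item.
  public static class LaunchMapper
      extends Mapper<Object, Text, NullWritable, Text> {
    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      // "/path/to/program" is a placeholder for your own binary.
      Process p = Runtime.getRuntime().exec(
          new String[] { "/path/to/program", value.toString() });
      int rc = p.waitFor();
      // Record the exit code so the job output doubles as a status report.
      context.write(NullWritable.get(), new Text(value + " exited " + rc));
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "map-only launcher");
    job.setJarByClass(TaskLauncher.class);
    job.setMapperClass(LaunchMapper.class);
    job.setNumReduceTasks(0);  // map-only: output goes straight to HDFS
    job.setOutputKeyClass(NullWritable.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

With zero reduces there is no shuffle or sort; the framework just schedules one map task per input split across the cluster and monitors/retries them for you, which covers the orchestration and monitoring you described.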
-- Owen
