For running commands on multiple machines, you can use Capistrano's shell utility. An added bonus is that you can script more complicated processes in Ruby if you want to.

www.capify.org
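
Something like this in a Capfile should do it (the hostnames and task name below are just placeholders, and this assumes the Capistrano 2 role/task DSL):

    # Capfile -- swap in your own node names
    role :cluster, "node01.example.com", "node02.example.com", "node03.example.com"

    # Run a command on every node in the role and tag each chunk of
    # output with the host it came from.
    task :check_disks, :roles => :cluster do
      run "df -h" do |channel, stream, data|
        puts "#{channel[:host]} (#{stream}): #{data}"
      end
    end

Then "cap check_disks" runs the task against every node, and "cap shell" drops you into an interactive prompt where you can fire off ad-hoc commands at all of the machines and see each host's output.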

-Bryan

On Apr 29, 2008, at 1:36 PM, Bradford Stephens wrote:

Greetings,

I'm compiling a list of (free/OSS) tools commonly used to administer Linux
clusters to help my company transition away from Win solutions.

I use Ganglia for monitoring the general stats of the machines (although I didn't get the Hadoop metrics to work). I also use ntop to check out network performance (especially with Nutch).

What do you all use to run your Hadoop clusters? I haven't found a good tool yet that lets me run a command on multiple machines and examine the output.

Cheers,
Bradford
