I already use Nagios and have been monitoring the availability, etc., of the network.
But I was hoping to get more insight into the load and inner workings of the
Hadoop cluster, and Ganglia seemed like a good start.
Do you use either Ganglia or Cacti, or something else?
-John
On Nov 12, 2009, at
Definitely check out my presentation on Cloudera's site; the link is above.
Hadoop-specific counters are available. Each component (namenode,
datanode, etc.) has counter objects associated with it.
Hadoop can push its statistics to Ganglia, so this is one nice
option. More or less once you get
We're in about the same boat as you. We use Nagios and have Cacti for other
things, so I'll probably use it for Hadoop as well. Ganglia seems interesting
but not that simple to set up. We also tried Cloudera Desktop, which gives you
a nice interface to see what's happening, but it requires using
thanks for the info.
So you are saying to install both Cacti and Ganglia, which is what I
was kind of thinking, to see which one I like best and which one
gives the best info.
The only thing is that the Ganglia install is not straightforward. Do
you have any recommendations for
Kevin,
What did you think of Cloudera Desktop? Were you able to get it
running with a vanilla Hadoop install?
-John
On Nov 12, 2009, at 9:40 AM, Kevin Sweeney wrote:
We're about in the same boat as you. We use Nagios and have Cacti
for other
things so I'll probably use it for hadoop as
Hi All,
I have been trying to set up a Hadoop cluster on a number of machines, a few
of which are multicore machines. I have been wondering whether the Hadoop
pseudo-distributed mode is something that can help me take advantage of the
multiple cores on my machines. All the tutorials say that the
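On the multicore question raised above: each map or reduce task runs in its own child JVM, so the usual way to use multiple cores on a node is to raise the per-node task slots. A hedged sketch; the property names are from the 0.20-era mapred-site.xml, and the slot counts are assumptions for a 4-core box, not values from this thread:

```xml
<!-- conf/mapred-site.xml: allow up to 4 concurrent map tasks and 2 reduce
     tasks per TaskTracker, so one multicore node runs tasks in parallel -->
<configuration>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>4</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>2</value>
  </property>
</configuration>
```

These settings apply equally in pseudo-distributed and fully distributed mode, since they are per-TaskTracker limits.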
Hi Sid
Check out the Building section in this link -
http://wiki.apache.org/hadoop/HowToRelease . It's pretty straightforward.
If you choose not to remove the test targets, expect the build to take
upwards of 2 hours as it runs through all the unit tests.
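For reference, a minimal sketch of the steps above, assuming an Ant-based source tree as in the 0.20 line (target names can differ between releases, so treat these as illustrative rather than exact):

```shell
# From the top of an unpacked Hadoop source tree (where build.xml lives):
ant tar     # builds the release tarball without running the unit tests
ant test    # the full unit-test suite; this is the multi-hour part
```

Skipping the `test` target is what keeps the build short, as noted above.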
Kind regards
Steve Watt
From:
Siddu
On 11/11/09 9:46 PM, John Martyniak j...@beforedawnsolutions.com wrote:
Is there a good solution for Hadoop node monitoring? I know that
Cacti and Ganglia are probably the two big ones, but are they the best
ones to use? Easiest to set up? Most thorough reporting, etc.?
I started to play
If I understand you correctly, you can run jps and see the Java JVMs running
on each machine - that should tell you whether you are running in pseudo-distributed mode or not.
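The jps check above can be sketched as follows (daemon names are from the 0.20 line; jps ships with the JDK, so no extra install is needed):

```shell
# List the Hadoop daemon JVMs running on this node.
jps
# A pseudo-distributed node typically shows all daemons locally, e.g.:
#   NameNode  SecondaryNameNode  DataNode  JobTracker  TaskTracker
# A fully distributed worker would usually show only DataNode and TaskTracker.
```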
--- On Thu, 11/12/09, kvorion kveinst...@gmail.com wrote:
From: kvorion kveinst...@gmail.com
Subject: About Hadoop pseudo distribution