Anton,

This has been in my "reply to this" queue for a while; I finally have time to get back to it.
This is exactly what I am doing in my environment -- I have gmond
running on a whole flock of servers, each forwarding data off to a
central collector server that is running gmetad. Each "cluster" runs on
its own port, and on the collector there is a gmond for each cluster
listening on that same port. gmetad is then configured to talk to each
of those gmond processes to collect the data.
For example, my web servers are configured with:
udp_send_channel {
  host = <collectorhost>
  port = 8654
}
On the collector host I have a gmond-8654.conf (see my other e-mail on
how I start multiple gmond processes) that has:
udp_recv_channel {
  port = 8654
}
tcp_accept_channel {
  port = 8654
}
And then in gmetad.conf I have:
data_source "Unknown" <collectorhost>:8649
data_source "Monitoring Servers" <collectorhost>:8650
data_source "DNS_LDAP Servers" <collectorhost>:8653
data_source "Web Servers" <collectorhost>:8654
data_source "Squid Servers" <collectorhost>:8655
...
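Since you mentioned consolidating onto a pair of servers: a gmetad data_source line accepts more than one host:port, polled in order as failover, so each cluster can be backed by a gmond on both collectors. A sketch, with collector1/collector2 as illustrative hostnames:

# Hypothetical redundant setup: gmetad polls collector1 first and
# falls back to collector2 if it is unreachable.
data_source "Web Servers" collector1:8654 collector2:8654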
That gives me a single gmetad interface with all of my "clusters" on it,
where each "cluster" is a group of servers with a similar function. I
keep the "Unknown" config to catch gmond processes that didn't get a
custom config; our package install uses the default port of 8649, so if
everything is configured correctly I should never see any hosts in
"Unknown".
Anton Yurchenko wrote:
> We are running fully unicast.
> It's good to hear that this should work, but something is missing here.
> Do any of the members here have some ideas to test/verify what may be
> missing in our setup?
>
> Thanks!
>
> On 10/15/2010 12:31 AM, Martin Knoblauch wrote:
>
>> Hi Anton,
>>
>> are you using a multicast or unicast setup? Unicast should work just fine.
>> At
>> least it did in 3.0.x. For multicast you *may* also need to run on distinct
>> mc-addresses in addition to the distinct ports, but I never tested that.
>>
>> Cheers
>>
>> Martin
>> ------------------------------------------------------
>> Martin Knoblauch
>> email: k n o b i AT knobisoft DOT de
>> www: http://www.knobisoft.de
>>
>>
>>
>> ----- Original Message ----
>>
>>> From: David Birdsong<[email protected]>
>>> To: Anton Yurchenko<[email protected]>
>>> Cc: [email protected]
>>> Sent: Fri, October 15, 2010 1:25:11 AM
>>> Subject: Re: [Ganglia-general] Running multiple gmonds on the same server
>>>
>>> I'm not there anymore, but I think it was 3.1.2.
>>>
>>> On Thu, Oct 14, 2010 at 4:23 PM, Anton Yurchenko<[email protected]>
>>> wrote:
>>>
>>>> Well that is good to know :)
>>>> What version of ganglia are you running?
>>>>
>>>> Thanks!
>>>>
>>>>
>>>> On 10/14/2010 4:21 PM, David Birdsong wrote:
>>>>
>>>>> FYI, we did exactly this for ~4-5 clusters at my last installation.
>>>>> It worked fine.
>>>>>
>>>>> On Thu, Oct 14, 2010 at 4:16 PM, Anton Yurchenko<[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Hi all,
>>>>>>
>>>>>> I am trying to consolidate all the gmond aggregation nodes for 3
>>>>>> clusters that we have onto a pair of servers.
>>>>>> I tried to have the gmond for each cluster run on its own set of ports,
>>>>>> but it's not working very well.
>>>>>> In the ganglia UI I can see the number of hosts is correct for each
>>>>>> cluster, but none of the other metrics are showing.
>>>>>> Is this not the right approach for running gmond for multiple clusters?
>>>>>>
>>>>>> Thanks!
>>>>>> Anton
>>>>>>
>>>>>>
--
Dan Rich <[email protected]> | http://www.employees.org/~drich/
| "Step up to red alert!" "Are you sure, sir?
| It means changing the bulb in the sign..."
| - Red Dwarf (BBC)
_______________________________________________
Ganglia-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/ganglia-general

