On Tue, 2007-11-06 at 13:23 -0600, [EMAIL PROTECTED] wrote:

> - Provide graphs as a web service
> This would accept requests for metric data (XML, JSON, whichever),  
> generate a graph, and respond with a URL to the graph.  This would  
> make it possible to embed metrics in other web applications (PHP or  
> non-PHP) without much fuss.

I was working on a similar idea about a year ago with the help of a
student.  The project has been idle due to "resource starvation", but I
thought I would explain my ideas in case you find any of them useful.

My idea was to provide a few simple means of accessing the RRD data from
ganglia.  I had two php files: data.php (which returned RRD data in XML
form) and graph.php (which returned a graph of RRD data).  They were
both invoked with URLs that looked something like this:

graph.php?start=..&end=..&line=..&label=..&color=..&metric=..&host=..&cluster=..

Proper values were inserted where appropriate, and of course many of
these parameters (like "color") were not applicable to data.php.  In
essence, this was little more than a thin wrapper around a call to
rrdtool.
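To make the "thin wrapper" idea concrete, here is a rough sketch (in Python rather than PHP, purely for illustration) of how such a script might translate the query parameters above into an rrdtool command line. The RRD directory layout and the "sum" data-source name follow what gmetad normally writes, but treat those as assumptions:

```python
def build_rrdtool_graph_cmd(params, rrd_dir="/var/lib/ganglia/rrds"):
    """Translate graph.php-style query parameters into an
    'rrdtool graph' command line (returned as an argv list)."""
    # gmetad stores one RRD per metric: <rrd_dir>/<cluster>/<host>/<metric>.rrd
    rrd = "%s/%s/%s/%s.rrd" % (rrd_dir, params["cluster"],
                               params["host"], params["metric"])
    return [
        "rrdtool", "graph", "-",          # "-" writes the PNG to stdout
        "--start", params["start"],
        "--end", params["end"],
        # gmetad names the data source "sum" (an assumption worth checking)
        "DEF:val=%s:sum:AVERAGE" % rrd,
        "LINE%s:val#%s:%s" % (params["line"], params["color"],
                              params["label"]),
    ]

cmd = build_rrdtool_graph_cmd({
    "start": "-1h", "end": "now", "metric": "load_one",
    "host": "node01", "cluster": "test", "line": "2",
    "color": "0000FF", "label": "load",
})
```

The actual graph.php would then run that command (e.g. via popen) and stream the PNG back with an image/png content type.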

This allowed access to the data on the server running gmetad without
needing to install the entire web frontend.  The goal was to build other
applications around these two simple "web services", much like you
mentioned in your email.

As a proof-of-concept project, I had the student integrate the ganglia
data into a PHP app we use to access PBS batch job info.  The original
PHP app presented the user with a simple form (which would generate
MySQL queries behind the scenes).  It then returned a pretty table which
showed PBS job info like when the job started/ended and what nodes the
job ran on.  My student made a few modifications to the app.  One of them
turned references to hostnames into HTML links.  Each hostname link
would go to a page that showed ganglia stats for that host during the
time window that the PBS job ran.  The hope was that this info could be
used to look at host stats for specific jobs and try to identify
bottlenecks, areas for optimization, etc.
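The linkification step itself is simple: each hostname becomes an anchor whose href is a graph.php URL bounded by the job's start and end times. A hypothetical sketch (again in Python for illustration; the real app was PHP, and the base URL is made up):

```python
from urllib.parse import urlencode


def host_link(host, cluster, metric, job_start, job_end,
              base="http://gmetad-host/graph.php"):
    """Return an HTML link from a PBS hostname to a ganglia graph
    covering the job's run window (start/end as epoch seconds)."""
    query = urlencode({
        "start": job_start, "end": job_end,
        "metric": metric, "host": host, "cluster": cluster,
    })
    return '<a href="%s?%s">%s</a>' % (base, query, host)


link = host_link("node01", "test", "load_one", 1194000000, 1194003600)
```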

This approach could also allow the web frontend to run on a different
host from gmetad.  The only reason they have to be on the same host now
is that the web frontend needs to call rrdtool to generate graphs (I
think).  It might also be useful for implementing caching since all the
caching logic could be hidden behind the single graph.php file.
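For instance, the caching could be as simple as keying on the full parameter set and serving a recent cached PNG when one exists. A sketch of that idea (the cache directory and max-age policy are hypothetical):

```python
import hashlib
import os
import time


def cached_graph_path(params, cache_dir="/tmp/graph-cache", max_age=60):
    """Return the cached PNG for this parameter set, or None if it is
    missing or stale and graph.php must regenerate it via rrdtool."""
    # Sort the items so the same parameters always yield the same key,
    # regardless of the order they appeared in the query string.
    key = hashlib.md5(repr(sorted(params.items())).encode()).hexdigest()
    path = os.path.join(cache_dir, key + ".png")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < max_age:
        return path
    return None
```

A cache miss would fall through to the normal rrdtool call, after which the new PNG is written to `path` for the next request.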

But I haven't thought all those ideas completely through, so there may
be drawbacks.

-- 
Rick Mohr
Systems Developer
Ohio Supercomputer Center


_______________________________________________
Ganglia-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/ganglia-developers
