In some basic tests that I performed by graphing an RRD directly from
the command line, I noticed that graphs using the AVERAGE consolidation
function show fluctuation in the max listing. If you create a graph
using the MAX consolidation function, the max value should stay the
same as you increase the time span.

The default RRD database settings have different recording
characteristics for displaying hourly, weekly, monthly and yearly data
(these are the default RRA definitions in your performance monitor).
Let's focus on the weekly display. By default, Zenoss is configured to,
say, collect 24 samples (two hours at 5-minute intervals) and record
both the AVERAGE of the sample set and the MAXIMUM value of that
sample set.
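As a rough sketch of what one such consolidation step does (the sample
values here are made up for illustration, not real Zenoss data), each
batch of 24 raw 5-minute samples collapses into one AVERAGE record and
one MAX record:

```python
# Toy model of one RRA consolidation step (not Zenoss code):
# 24 raw 5-minute samples (two hours) collapse into a single record.
samples = [10, 12, 11, 95, 10, 13, 12, 11,
           10, 12, 14, 11, 10, 13, 12, 10,
           11, 12, 10, 13, 11, 12, 10, 11]

avg_record = sum(samples) / len(samples)  # what the AVERAGE RRA stores
max_record = max(samples)                 # what the MAX RRA stores
```

Note that the spike of 95 survives intact in the MAX record but barely
moves the AVERAGE record.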

The rrdtool graph command will attempt to select the RRA data set that
most closely matches the resolution you have chosen to display.
Normally, your zoom will stay with the same data set until the time
frame exceeds the amount available in the data set, or until the
resolution of another data set more closely matches that of the graph.

So if your graph is 350 pixels wide and you are displaying a week's
worth of results, it will select the RRA that provides a minimum of 50
data points per day and holds a week's worth of data. If you display
two weeks of data, you would need to double the width of the graph to
keep the same resolution; if you keep the same pixel width instead, it
could choose an RRA data set that has only 25 data points per day.
Additionally, the resolution of the height also comes into play in
selecting which RRA data set to use.
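A simplified model of that selection (the RRA sizes below are
hypothetical, and real rrdtool also weighs the requested resolution,
not just coverage): pick the finest-resolution RRA whose retained
history still covers the requested span.

```python
def pick_rra(rras, span_seconds):
    """Pick the finest-resolution RRA covering the span.
    rras: list of (step_seconds, rows). Simplified sketch of
    rrdtool's behavior, not its actual algorithm."""
    covering = [r for r in rras if r[0] * r[1] >= span_seconds]
    return min(covering, key=lambda r: r[0])

# Hypothetical set: 5-min data for 2 days, 30-min for 2 weeks,
# 2-hour for 2 months.
rras = [(300, 576), (1800, 672), (7200, 720)]

one_week = 7 * 24 * 3600
step, rows = pick_rra(rras, one_week)  # the 5-min RRA only holds
                                       # 2 days, so the 30-min RRA wins
```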

Now say that you use an RRA data set with a short frequency (e.g. 5
minutes per record) and you are displaying a long period of time. If
you hit the point where there are two data points available for each
pixel, guess what happens: the first two values get paired and
averaged, then the next two, and so on. So if you have a max value in
one data point, it gets watered down as it is averaged with its
neighbor.
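That watering-down is easy to demonstrate (made-up numbers, one spike
in an otherwise flat series):

```python
# Two data points per pixel: each adjacent pair is averaged before
# being drawn, so a single-sample spike loses half its height.
data = [10, 10, 90, 10, 10, 10]   # one 5-minute spike of 90

# Pair up consecutive values and average each pair.
pixels = [(a + b) / 2 for a, b in zip(data[::2], data[1::2])]

max(data)    # 90 -- the real peak
max(pixels)  # 50.0 -- the peak as actually drawn
```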

You can take the height out of the decision about which RRA data set
to choose by setting the maximum and minimum values to display in the
graph. This means the Y-axis scale will no longer be auto-scaled and
variable (because Zenoss should include the --rigid flag in the graph
command). But the only real way to be sure that the max does not get
averaged is to include enough RRA statements to cover the most common
viewing time frames. The default RRA statements were designed with a
5-minute step in mind; if you shorten the SNMP cycle time in your
performance monitor, you should adjust the RRA statements accordingly.
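For example, if you drop the step from 300 seconds to 60 seconds, each
RRA's steps-per-row should scale up by the same factor so every
consolidated record still spans the same wall-clock interval (the RRA
sizes below are illustrative, not necessarily your exact defaults):

```python
old_step, new_step = 300, 60  # seconds per raw sample

# RRAs as (steps_per_row, rows); e.g. (6, 600) means each record
# consolidates 6 raw samples and 600 records are kept.
old_rras = [(1, 600), (6, 600), (24, 600), (288, 600)]

scale = old_step // new_step  # 5x more raw samples per interval
new_rras = [(spr * scale, rows) for spr, rows in old_rras]
# Each record still covers spr * old_step seconds of wall-clock
# time, and the retained history per RRA is unchanged.
```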


On Thu, 2007-07-12 at 19:02 +0000, agthurber wrote:
> I noticed when you zoom out on a graph that it seems to average out the data 
> and you loose the highest values. Is there any way to force the graphs to 
> retain the max values?
> 
> ------------------------
>  A. G. Thurber
-- 
James D. Roman
IT Network Administration

Terranet Inc.
On contract to:
Science Systems and Applications, Inc.

_______________________________________________
zenoss-users mailing list
[email protected]
http://lists.zenoss.org/mailman/listinfo/zenoss-users
