> Alex van den Bogaerdt-5 wrote
>> You mean you want to see the same bars, only reduced in width? In other
>> words, get smaller bars with gaps in between?
> That would also work visually, but what I had in mind was not to reduce
> the width, but to create the bars by separating them with vertical lines
> of the correct height.

I have difficulty understanding what you describe here.

>> If you'd have step sizes smaller than 3600 seconds and measure more
>> often than every 3600 seconds, you would end up with all gas used during
>> the last of each hour's steps, and zero in the others. Definitely not
>> what you want.
> Wouldn't this actually create the desired bars in an easier way? I will
> try...

Suppose you have steps of 15 minutes. For three steps the gas consumption
is 0 cubic meters (kuub), then for one step there is a rate, then three
steps of zero again, and so on.

When graphing a day, you would graph 96 bars, 72 of them being zero. The
other 24 bars would show all gas consumption, but 4 times as high as you
would expect.

This can be corrected with a CDEF, no problem.
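A tiny Python sketch of the arithmetic behind that correction (the numbers
are made up; the CDEF in the comment is just RPN for "divide by 4"):

```python
# made-up numbers: with 15-minute steps, a whole hour's gas lands in
# one step, making that step's rate 4x the hourly average
step = 900                       # seconds per step
hourly_gas = 1.0                 # 1 m3 of gas in one hour (hypothetical)

spike_rate = hourly_gas / step   # rate of the one non-zero step
shown_rate = spike_rate / 4      # what a CDEF like "gas,4,/" computes

hourly_avg = hourly_gas / 3600
print(shown_rate == hourly_avg)  # True: the bar is no longer 4x too high
```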

But... what happens when you are going to display your data for a week? 
For a month?  For a year?  This is where consolidation kicks in. Which
brings us back to the next part:

>> Write down 24 consecutive rates. They will differ during the day.
>> How do you want to combine these 24 rates into 1 ?
>> Hint: you do not want the last of these 24 (but you are doing that now:
>> LAST CF).
>> Hint: you do not want the highest of these 24 (but that would happen
>> when using MAX CF).
>> Once more: surface, not rate.
> I can see where you are going with this. I want the total surface of the
> 24 bars to be the same as the surface of the 1 bar. So AVERAGE would be
> the correct CF choice here, because only that CF would give me the same
> surface. I will try this.
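Exactly. To see why AVERAGE is the only CF that preserves the surface, a
quick sketch with made-up hourly rates:

```python
# surface = rate * time; only AVERAGE keeps it when 24 bars become 1
hourly_rates = [0.25, 0.0, 1.5, 0.75] * 6     # 24 made-up hourly rates

surface_24_bars = sum(r * 3600 for r in hourly_rates)

avg = sum(hourly_rates) / len(hourly_rates)
surface_1_bar = avg * 24 * 3600               # one day-wide bar

print(surface_24_bars, surface_1_bar)         # 54000.0 54000.0

# LAST or MAX would give a different surface:
print(hourly_rates[-1] * 24 * 3600)           # 64800.0
print(max(hourly_rates) * 24 * 3600)          # 129600.0
```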


> I could (or maybe even should) still use LAST for the highest resolution
> (one hour) in this case, right? As it's one PDP? Or are all my 10-second
> attempts to store the value also CF'ed?

Phase one: converting the input number to a rate.

Your step size is 3600. You can update multiple times within this one
step, no problem. The result is still one step.

Normalizing happens here. If/when you update with timestamps which are not
a whole multiple of the step size, this may produce surprising results if
you are unaware of normalizing.
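A hedged sketch of the idea (this is not RRDtool's actual code, just an
illustration of the time-weighted smearing):

```python
# normalization stores, per step, the time-weighted average of the
# rates seen during that step
STEP = 3600

# (timestamp, rate since the previous update); the timestamps are NOT
# whole multiples of 3600, so the middle interval straddles a boundary
updates = [(3000, 0.0), (6600, 1.0), (10200, 0.0)]

def step_rate(step_start, updates, step=STEP):
    """Time-weighted average rate inside [step_start, step_start + step)."""
    total = 0.0
    prev_t = 0
    for t, rate in updates:
        lo = max(prev_t, step_start)      # overlap of (prev_t, t]
        hi = min(t, step_start + step)    # with this step
        if hi > lo:
            total += (hi - lo) * rate
        prev_t = t
    return total / step

# the 1.0 rate really lasted from t=3000 to t=6600, but it gets
# smeared over the first two steps as 1/6 and 5/6
print(round(step_rate(0, updates), 4), round(step_rate(STEP, updates), 4))
# 0.1667 0.8333
```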

Consolidation did not yet happen, but you do now have a normalized rate.

One update can generate more than one step. This could happen, for
instance, if you update every two hours. You then have two (or more)
separate steps with independent normalized rates.

Next phase: storing the rates in the RRD. This is where consolidation
happens.
If there is only one step, then consolidation is a no-op. First, last,
min, max, they are all the same.

min(2) = max(2) = average(2) = last(2) = 2

The consolidation process is important when more than one step is turned
into one bigger step, e.g. 4 times one hour becomes one time 4 hours.

min(4,0,1,3) = 0
max(4,0,1,3) = 4
average(4,0,1,3) = 2
last(4,0,1,3) = 3
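In Python terms, with the same four rates:

```python
rates = [4, 0, 1, 3]
print(min(rates), max(rates), sum(rates) / len(rates), rates[-1])
# 0 4 2.0 3
```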

This can happen if you have RRAs which store more than one primary step
per bucket (the RRA's "steps" parameter). It can also happen at graph
time, when the available RRAs do not make a perfect match with the graphed
time span.
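A small sketch of such a bucket (made-up hourly rates, AVERAGE as the CF):

```python
# an RRA with steps=4 and CF AVERAGE turns every four primary steps
# into one stored row
hourly = [4, 0, 1, 3, 1, 3, 0, 0]    # 8 hypothetical one-hour rates
STEPS_PER_ROW = 4
rows = [sum(hourly[i:i + STEPS_PER_ROW]) / STEPS_PER_ROW
        for i in range(0, len(hourly), STEPS_PER_ROW)]
print(rows)  # [2.0, 1.0]
```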

> I do need to read the tutorials again, because I am getting all these
> questions about stuff I thought I understood.

shameless plug of own site:

The new RRDtool may do things somewhat differently, but the basics should
still apply.

rrd-users mailing list