Looking at the code and following the values during a reset: how does it calculate the rate for the interval during which the reset occurred? I thought that interval would go to NaN, since the delta should be negative, but it looks like it calculates a value.
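If I understand the rrdtool source correctly, a COUNTER data source treats a negative delta as a counter *wrap*, not a reset, and adds 2^32 (or, if still negative, 2^64) before computing the rate; that is why the reset interval gets a (bogus, huge) value instead of NaN. DERIVE skips the wrap correction, so the delta stays negative, and a minimum of 0 then turns the sample into *UNKNOWN*. A minimal sketch of the two behaviors (my own illustration, not the actual rrdtool code):

```python
WRAP32 = 2 ** 32
WRAP64 = 2 ** 64

def counter_rate(prev, cur, step):
    """COUNTER semantics: a negative delta is assumed to be a wrap."""
    delta = cur - prev
    if delta < 0:
        delta += WRAP32               # assume a 32-bit counter wrapped...
    if delta < 0:
        delta += WRAP64 - WRAP32      # ...or, failing that, a 64-bit one
    return delta / step

def derive_rate(prev, cur, step, rrd_min=0):
    """DERIVE semantics: keep the raw delta; below-minimum -> unknown."""
    rate = (cur - prev) / step
    if rrd_min is not None and rate < rrd_min:
        return None                   # stored as *UNKNOWN* (NaN)
    return rate

# A reset: the counter drops from 5000 back to 100 over a 300 s step.
print(counter_rate(5000, 100, 300))   # huge positive rate (wrap assumed)
print(derive_rate(5000, 100, 300))    # None: the offending sample is lost
```

So DERIVE with rrd-min=0 is "correct" in the sense that it refuses to invent a wrap that never happened: it throws away one sample instead of recording a spike of roughly 2^32/step.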
I take it DERIVE is the right thing to do, and I believe it is correct. I would just like to be able to explain why it is correct to my boss and everyone else who asks. Any takers?

Christian Pearce
-PacketPusher

On Fri, 26 Jan 2001, Paul Wickman wrote:
> Use DERIVE instead of COUNTER. That, in combination with rrd-min=0, will
> eliminate the "reset effect". You will lose the offending sample, but in
> my (and many other people's) case, it's a worthy tradeoff.
