Hi all,

For the new Blur Console I am trying to add a graph that displays the
average query time, per minute, across all queries.  Currently I loop
over all of the queries, collect the real time from each CpuTimes
object, and average the numbers.  I don't think this is the correct
approach, because even with a single query my calculation is way off
from what the shell displays.  For instance, I ran one query that the
shell reported as taking 123ms, and my calculation came back with
3.5ms.  I do know that the CpuTimes values are in nanoseconds, so I am
converting to ms (and I did verify that I have the correct number of
zeros).
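
For reference, here is a simplified sketch of the calculation I
described.  The getCpuTimes() map, the getRealTime() accessor, and the
package name are my assumptions about the Thrift-generated objects, so
the exact names may not match:

    import java.util.Map;

    // Package name assumed for the Thrift-generated classes.
    import org.apache.blur.thrift.generated.BlurQueryStatus;
    import org.apache.blur.thrift.generated.CpuTime;

    public class QueryTimeStats {
      // Assumed shape: getCpuTimes() returns Map<String, CpuTime> keyed
      // by shard, and getRealTime() is in nanoseconds.
      public static double averageRealTimeMs(Iterable<BlurQueryStatus> queries) {
        long totalNanos = 0;
        int count = 0;
        for (BlurQueryStatus status : queries) {
          for (Map.Entry<String, CpuTime> entry : status.getCpuTimes().entrySet()) {
            totalNanos += entry.getValue().getRealTime();
            count++;
          }
        }
        if (count == 0) {
          return 0.0;
        }
        // 1 ms = 1,000,000 ns
        return (totalNanos / (double) count) / 1000000.0;
      }
    }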

Thanks for your help,
Chris
