Tom,
    Not a good idea. Since new tables can be added or dropped, and since queries against the data dictionary views can be poor performers themselves, you won't get a good picture of actual operational speed. Running that query every 5 minutes will itself have a negative impact on performance. It's kind of like running out into traffic every 5 minutes, stopping a passing car, and asking how long it took to get from the last intersection to here. And my mother taught me never to run out into traffic!
 
    Why not query the memory each day or half-day to find the 10 highest resource consumers, or the 10 longest-running queries? You can kill 2 birds with 1 stone. By tracking similar queries over time, you can note changes in resource use and elapsed time. At the same time, you identify the most expensive SQL, and thus the 10 best candidates for tuning!
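Something along these lines would do it (an untested sketch against V$SQL; swap BUFFER_GETS for DISK_READS or another metric to taste):

sql> select *
  2    from (select sql_text, executions, buffer_gets, disk_reads
  3            from v$sql
  4           order by buffer_gets desc)
  5   where rownum <= 10;

The inline view plus ROWNUM is the standard Oracle top-N idiom; ordering first and filtering second is what makes it return the true top 10.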
 
Dan Fink
-----Original Message-----
From: Terrian, Tom (Contractor) (DAASC) [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, January 15, 2003 10:54 AM
To: Multiple recipients of list ORACLE-L
Subject: Database tracking

All, I would like to track the performance of my production databases by running the same SQL statement against each database every 5 minutes or so and recording the results.  For example:
sql> set timing on;
sql> select count(*) from dba_tables;
 
That way I would know if they are getting faster or slower over time.  Has anyone already done this?  Would there be a good SQL statement to use?
 
Thanks,
Tom Terrian
 
