John,
My initial tests indicated that not all tables that had DML activity were
falling into the 'stale' category. I checked the ratio A/B, where:
A is the sum of inserts + updates + deletes from user_tab_modifications
B is num_rows from user_tables (or user_tab_partitions).
I observed that if this ratio crossed roughly 10%, the table was marked stale.
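A sketch of the query behind that A/B check (the view and column names are the real ones from user_tab_modifications and user_tables; the old-style join is used since this thread is about 8i, and the ordering is my choice):

```sql
-- Ratio of tracked DML to total rows, per table.
-- user_tab_modifications is only populated for tables with MONITORING on,
-- and is flushed periodically, so very recent DML may not show up yet.
SELECT m.table_name,
       m.inserts + m.updates + m.deletes                           AS a_dml,
       t.num_rows                                                  AS b_rows,
       ROUND((m.inserts + m.updates + m.deletes) / t.num_rows, 3)  AS ratio
  FROM user_tab_modifications m,
       user_tables            t
 WHERE t.table_name = m.table_name
   AND t.num_rows > 0
 ORDER BY ratio DESC;
```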
We have a data warehouse that is a combination of snapshots and table builds
based on the snapshots. The table builds run at 4:30 am; scripts are set up to
start the snapshots at 7:00 am and end them at 9:00 pm. At 6:30 am a script
performs an analyze on ALL tables (except those owned by SYS and SYSTEM) in
the database.
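That 6:30 am step is commonly done with a generated script; a minimal sketch (the schema filter matches what was described, but COMPUTE vs ESTIMATE and the sample size are my assumptions, not the original poster's):

```sql
-- Generate one ANALYZE statement per non-SYS/SYSTEM table;
-- spool the output to a file and run it as the 6:30 am job.
SELECT 'ANALYZE TABLE ' || owner || '.' || table_name ||
       ' ESTIMATE STATISTICS SAMPLE 10 PERCENT;'
  FROM dba_tables
 WHERE owner NOT IN ('SYS', 'SYSTEM')
 ORDER BY owner, table_name;
```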
Statistics become old after a single change is made to the object. Granted,
adding a single row to a 1-million-row table isn't going to affect much, but
adding 100,000 rows will. So where do you draw the line? I really don't know,
but 5%-10% is generally a good measure. Sometimes more.
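That rule of thumb is easy to express as a check; a minimal sketch in Python (the function name and the 10% default cutoff are mine for illustration, not anything Oracle publishes):

```python
def is_stale(inserts, updates, deletes, num_rows, threshold=0.10):
    """Flag a table as stale when tracked DML reaches `threshold`
    of the row count recorded at the last analyze."""
    if num_rows == 0:
        # No baseline row count (never analyzed, or empty): re-analyze.
        return True
    return (inserts + updates + deletes) / num_rows >= threshold

# 100,000 changed rows against a 1,000,000-row table hits the 10% line;
# a single changed row does not.
print(is_stale(100_000, 0, 0, 1_000_000))  # True
print(is_stale(1, 0, 0, 1_000_000))        # False
```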
Which version of Oracle are you using? In 8i you can set 'monitoring on' for
the tables and use dbms_stats to analyze stale. (Though I am getting an error
while using dbms_stats on the partitioned tables, so I have made a home-made
version to analyze stale.) As per my calculations, package dbms_stats uses
roughly 10% as the boundary for staleness.
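For reference, a sketch of the 8i mechanics being described (the table name is a placeholder; GATHER STALE only considers tables that have MONITORING enabled):

```sql
-- Turn on DML tracking for a table; this is what populates
-- user_tab_modifications and makes staleness detection possible.
ALTER TABLE my_table MONITORING;

-- Re-gather statistics only for the tables Oracle considers stale.
BEGIN
  DBMS_STATS.GATHER_SCHEMA_STATS(
    ownname => USER,
    options => 'GATHER STALE');
END;
/
```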
Anand,
Just curious: Is there some test or other observation that you can share
with the list about dbms_stats using 10% as a boundary for staleness?
John Kanagaraj