I saw it in print; the only thing that seemed interesting about it was
the recommendation that query optimization be biased towards the
notion of "stable plans," query plans that may not be the most
"aggressively fast," but which don't fall apart into hideous
performance if the estimates are a little bit off.
And the answer is interesting as well:
"I think we have to approach it in two ways. One is that you have to be
able to execute good plans, and during the execution of a plan you want
to notice when the actual data is deviating dramatically from what you
expected. If you expected five rows and you’ve got a million, chances
are your plan is not going to do well because you chose it based on the
assumption of five. Thus, being able to correct mid-course is an area of
enhancement for query optimizers that IBM is pursuing."
Hmmm, dynamic re-planning!
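
To make the mis-estimate scenario concrete (the table name and figures
below are made up), PostgreSQL's EXPLAIN ANALYZE already prints the
planner's estimated row count next to the actual one, so a plan chosen
on the assumption of five rows is easy to spot after the fact, even if
the correction still has to happen by hand:

  EXPLAIN ANALYZE SELECT * FROM orders WHERE status = 'pending';
  -- Seq Scan on orders  (cost=0.00..431.00 rows=5 width=64)
  --                     (actual time=0.021..512.330 rows=1000000 loops=1)
  --   Filter: (status = 'pending')
  -- The estimate (rows=5) and reality (rows=1000000) diverge wildly;
  -- mid-course re-planning would let the executor notice this and switch
  -- plans in flight instead of grinding through the bad one.

That's what the quoted answer is getting at: the information needed to
detect the deviation is already available at run time; the interesting
part is acting on it while the query is still executing.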
Chris