On Tue, Feb 01, 2005 at 12:06:27AM -0500, Tom Lane wrote:
> "Jim C. Nasby" <[EMAIL PROTECTED]> writes:
> > On Mon, Jan 31, 2005 at 03:26:12PM -0500, Tom Lane wrote:
> >> Preferably a whole lot of queries.  All the measurement techniques I can
> >> think of are going to have a great deal of noise, so you shouldn't
> >> twiddle these cost settings based on just a few examples.
> > Are there any examples of how you can take numbers from pg_stats_* or
> > EXPLAIN ANALYZE and turn them into configuration settings (such as
> > random_page_cost)?
> Well, the basic idea is to adjust random_page_cost so that the ratio of
> estimated cost to real elapsed time (as shown by EXPLAIN ANALYZE) is the
> same for seqscans and indexscans.  What you have to watch out for is
> that the estimated cost model is oversimplified and doesn't take into
> account a lot of real-world factors, such as the activity of other
> concurrent processes.  The reason for needing a whole lot of tests is
> essentially to try to average out the effects of those unmodeled
> factors, so that you have a number that makes sense within the planner's
> limited view of reality.
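To make the arithmetic concrete, here's a rough sketch (my own, not a
pg tool) of the adjustment Tom describes: take the planner's estimated
cost and the actual elapsed time from EXPLAIN ANALYZE for one seqscan
plan and one indexscan plan, and scale random_page_cost so the
cost-per-millisecond ratios line up. It assumes indexscan cost scales
roughly linearly with random_page_cost, which is only approximately
true (only the random-I/O portion of the cost scales), so treat the
result as a starting point, and average over many queries as suggested
above.

```python
def suggest_random_page_cost(current_rpc, seq_est_cost, seq_ms,
                             idx_est_cost, idx_ms):
    """Hypothetical helper: inputs are planner 'cost' units and actual
    elapsed milliseconds, both read off EXPLAIN ANALYZE output for a
    representative seqscan plan and indexscan plan."""
    seq_ratio = seq_est_cost / seq_ms   # cost units per ms, seqscan
    idx_ratio = idx_est_cost / idx_ms   # cost units per ms, indexscan
    # If indexscans look relatively cheaper than they measured
    # (idx_ratio < seq_ratio), the planner is underestimating random
    # I/O, so random_page_cost should rise; and vice versa.
    return current_rpc * seq_ratio / idx_ratio

# Example with made-up numbers: seqscan estimated at 1000 cost units,
# ran in 100 ms; indexscan estimated at 400, ran in 80 ms.
print(suggest_random_page_cost(4.0, 1000.0, 100.0, 400.0, 80.0))  # 8.0
```

Again, a single pair of measurements is dominated by noise (caching,
concurrent activity); you'd want to repeat this across a whole lot of
query pairs and look at the distribution, not one number.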

Given that, I guess the next logical question is: what would it take to
collect stats on queries so that such an estimate could be made? And
would it be possible/make sense to gather stats useful for tuning the
other parameters?
Jim C. Nasby, Database Consultant               [EMAIL PROTECTED] 
Give your computer some brain candy! www.distributed.net Team #1828

Windows: "Where do you want to go today?"
Linux: "Where do you want to go tomorrow?"
FreeBSD: "Are you guys coming, or what?"
