Christopher Browne <[EMAIL PROTECTED]> writes:
> Are you certain it's a linear system?
If you consider only the GUC parameters that tell Postgres how long various
real-world operations take (all the *_cost parameters), then it's a linear
system. It has to be: the resulting time is just a sum of the times for some
number of each of these real-world operations.
If you include parameters like the geqo_* parameters, or the hypothetical
parameter that controls what selectivity to assume for clauses with unknown
selectivity, then no, it wouldn't be.
But if you assume the estimated row counts are correct and you're just trying
to solve for the parameters to come up with the most accurate cost for the
current hardware then I think you're golden.
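For concreteness, here's a minimal sketch of what solving that linear system
could look like (Python, with made-up operation counts and costs -- none of
these numbers come from the actual planner). Each executed query contributes
one equation, total_time = sum(count_i * cost_i), and an ordinary
least-squares fit recovers the per-operation costs:

```python
# Hypothetical example: recover per-operation costs from measured query
# times. Each row of `counts` holds how many of each operation a query
# performed (say: sequential page reads, random page reads, tuples
# processed); `times` holds the measured total time for each query.

def solve_normal_equations(counts, times):
    """Least-squares fit: find the costs minimizing the squared error
    by solving the normal equations (A^T A) x = A^T b with Gaussian
    elimination.  Pure stdlib; fine for a handful of parameters."""
    n = len(counts[0])
    # Build A^T A (n x n) and A^T b (length n).
    ata = [[sum(row[i] * row[j] for row in counts) for j in range(n)]
           for i in range(n)]
    atb = [sum(row[i] * t for row, t in zip(counts, times))
           for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[pivot] = ata[pivot], ata[col]
        atb[col], atb[pivot] = atb[pivot], atb[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atb[r] -= f * atb[col]
    # Back substitution.
    costs = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(ata[r][c] * costs[c] for c in range(r + 1, n))
        costs[r] = (atb[r] - s) / ata[r][r]
    return costs

# Synthetic "true" costs: seq_page=1.0, random_page=4.0, cpu_tuple=0.01.
true_costs = [1.0, 4.0, 0.01]
counts = [
    [100,   0, 1000],
    [ 10,  50,  200],
    [500,   5, 8000],
    [  0, 200,  100],
]
# Noise-free measurements, so the fit recovers the costs exactly.
times = [sum(c * k for c, k in zip(row, true_costs)) for row in counts]

fitted = solve_normal_equations(counts, times)
print([round(c, 4) for c in fitted])  # ≈ [1.0, 4.0, 0.01]
```

With real measurements the times would be noisy, so you'd want many more
queries than parameters, but the shape of the problem is the same.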
> There might well be some results to be gotten out of a linear
> approximation; the Grand Challenge is to come up with the model in the
> first place...
Indeed. The model's not perfect now, of course, and it'll never really be
perfect, since some of the parameters represent operations that don't always
have a consistent cost. But you should be able to solve for the values that
produce the most accurate totals most often. There may be some tradeoffs (and
therefore new GUC variables :-)).
It occurs to me that there's no reason to use the unreliable estimated counts
from EXPLAIN. You may as well count the operations accurately and use the
actual values from performing the query. That also means there's no reason to
discard data points whose estimates turned out to be inaccurate.
Moreover, the overhead issue is a non-issue, since you only need the total
time and the total counts. You would have the overhead of performing lots of
increments on those counters, but you only have to call gettimeofday() twice:
once at the beginning and once at the end.