At 10:27 AM 8/27/2005, Tom Lane wrote:
It certainly seems common in the EXPLAIN ANALYZE output I see that
the (estimated) cost of a Nested Loop is far higher than the actual cost.
Arjen van der Meijden <[EMAIL PROTECTED]> writes:
> But apparently there is a bug in the explain mechanism of the 8.1devel
> I'm using (I downloaded a nightly 25 august somewhere in the morning
> (CEST)), since it returned:
> ERROR: bogus varno: 9
Yeah, someone else sent in a test case for this failure (or at least one
with a similar symptom) yesterday. I'll try to fix it today.
> Is a nested loop normally so much (3x) more costly than a hash join? Or
> is it just this query that gets estimated wrongly?
There's been some discussion that we are overestimating the cost of
nestloops in general, because we don't take into account that successive
scans of the inner relation are likely to find many pages already in
cache from the earlier scans. So far no one's come up with a good cost
model to use for this, though.
regards, tom lane
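To illustrate the caching effect Tom describes, here is a deliberately simple sketch (all function names and the cache model are invented for illustration; this is not PostgreSQL's actual cost code). It contrasts a naive model that charges every rescan of the inner relation in full against an optimistic one where pages read once stay resident:

```python
def repeated_scan_page_reads(rel_pages, cache_pages, num_scans):
    """Optimistic model: the first scan reads everything from disk,
    and pages stay cached up to the cache size, so later scans only
    re-read the pages that didn't fit."""
    if num_scans <= 0:
        return 0
    first_scan = rel_pages                   # cold cache: read everything
    cached = min(rel_pages, cache_pages)     # pages that stay resident
    per_rescan = rel_pages - cached          # misses on each later scan
    return first_scan + (num_scans - 1) * per_rescan

def naive_page_reads(rel_pages, num_scans):
    """Naive model: every scan pays the full physical read cost."""
    return rel_pages * num_scans

# Inner relation of 1000 pages, scanned 10 times, cache holds it all:
print(repeated_scan_page_reads(1000, 1000, 10))  # 1000 reads
print(naive_page_reads(1000, 10))                # 10000 reads
```

The tenfold gap between the two models is exactly the kind of overestimate being discussed.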
What happened when someone tried the naive approach of telling the
planner to estimate the cost of a nested loop on the assumption that
the relations involved fit in RAM as far as possible? When there are
multiple such mappings, use whichever one results in the lowest cost
for the NL in question.
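A minimal sketch of that naive approach (the cost constants and candidate mappings below are hypothetical, chosen only to make the idea concrete): cost the nested loop under each way of fitting its inputs into a RAM budget, then keep the cheapest mapping that fits.

```python
SEQ_PAGE_COST = 1.0      # cost of a physical page read (made up)
CACHED_PAGE_COST = 0.01  # cost of a cached page access (made up)

def nestloop_cost(outer_pages, inner_pages, outer_rows, ram_pages):
    """Lowest estimated cost over candidate cache mappings:
    cache nothing, cache the inner relation, or cache the outer."""
    candidates = []
    for cache_inner, cache_outer in [(False, False), (True, False), (False, True)]:
        used = (inner_pages if cache_inner else 0) + \
               (outer_pages if cache_outer else 0)
        if used > ram_pages:
            continue  # this mapping doesn't fit in RAM
        outer_cost = outer_pages * (CACHED_PAGE_COST if cache_outer
                                    else SEQ_PAGE_COST)
        if cache_inner:
            # first inner scan is physical, the rest hit cache
            inner_cost = (inner_pages * SEQ_PAGE_COST +
                          (outer_rows - 1) * inner_pages * CACHED_PAGE_COST)
        else:
            inner_cost = outer_rows * inner_pages * SEQ_PAGE_COST
        candidates.append(outer_cost + inner_cost)
    return min(candidates)

# 500-page inner relation rescanned 1000 times, 600 pages of RAM:
print(nestloop_cost(100, 500, 1000, 600))  # caching the inner wins
```

When the RAM budget is too small for any caching, the function falls back to the uncached (worst-case) estimate, which matches the behavior the proposal asks for.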
Clearly, this should underestimate the constant factors of the
operations involved, but since nested loops have the only polynomial
growth function among the planner's choices, NL's should still have a
decent chance of being costed as more expensive than the other choices
under most circumstances.
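That growth argument is easy to demonstrate numerically (the per-pair and per-row unit costs below are invented): nested-loop work grows with the product of the input sizes while hash-join work grows roughly with their sum, so even a very optimistic nested-loop constant loses once the inputs get large.

```python
def nl_work(outer_rows, inner_rows, per_pair=0.001):
    """Nested loop: work proportional to the product of the inputs,
    with a deliberately optimistic (tiny) per-pair constant."""
    return outer_rows * inner_rows * per_pair

def hash_work(outer_rows, inner_rows, per_row=1.0):
    """Hash join: work roughly proportional to the sum of the inputs."""
    return (outer_rows + inner_rows) * per_row

for n in (100, 1_000, 10_000):
    print(n, nl_work(n, n), hash_work(n, n))
# At n=100 the nested loop is far cheaper despite the hash join's
# lower asymptotic growth; by n=10,000 the product term dominates
# and the hash join wins, even with the 1000x constant advantage.
```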
In addition, if those costs are based on actual measurements of how
long it takes to do such scans then the estimated cost has a decent
chance of being fairly accurate under such circumstances.
It might not work well, but it seems like a reasonable first attempt
at a solution?