On Mon, Jun 13, 2016 at 3:40 PM, br...@momjian.us <br...@momjian.us> wrote:

> > > Looking at how the code behaves, it seems custom plans that are _more_
> > > expensive (plus planning cost) than the generic plan switch to the
> > > generic plan after five executions, as now documented.  Custom plans
> > > that are significantly _cheaper_ than the generic plan _never_ use the
> > > generic plan.
> >
> > Yes, that's what the suggested documentation improvement says as well,
> > right?
> Yes.  What is odd is that it isn't the plan of the actual supplied
> parameters that is cheaper, just the generic plan that assumes each
> distinct value in the query is equally likely to be used.  So, when we
> say the generic plan is cheaper, it is just comparing the custom plan
> with the supplied parameters vs. the generic plan --- it is not saying
> that running the supplied constants with the generic plan will execute
> faster, because in fact we might be using a sub-optimal generic plan.
> For example, given the test table that I posted earlier, if you ran the
> most common constant (50% of rows) the first five times, the custom plan
> would use a sequential scan.  On the sixth run of that same constant, a
> bitmap scan generic plan would be used.  Now, that does have a lower
> cost, but only for the _average_ distinct value, not for the 50%
> constant that is being used.  A bitmap scan on a constant that would
> normally use a sequential scan will take longer than the sequential scan
> would have, because if it didn't, the custom plan would have chosen the
> bitmap scan.
> I am not sure how we can improve things, but I wanted to clarify exactly
> what is happening.

Comparisons on non-uniformly-distributed
columns and specification of non-existent values affect the average
plan cost, and hence if and when a generic plan is chosen.

If we are going to be more precise, let's do so here as well.  Just
reading this, I have no clue whether non-uniformity and frequent searches
for non-existent values will increase or decrease the average plan cost.

I'm still not certain how this is particularly useful.  If we are willing
to draw a conclusion here, in what circumstances would I, as an end-user,
want to forgo using a prepared statement and instead dynamically construct
an SQL statement?  At this point, while this seems like good detail,
oftentimes my choice of parameters is influenced by what I consider data
external to the query proper, not by any inherent performance aspect.  I'd
consider this advanced usage, which doesn't fit neatly into the SQL
Command section of the docs.

David J.
