On 3/17/14, 5:07 PM, Claudio Freire wrote:

> On Mon, Mar 17, 2014 at 7:01 PM, Jim Nasby <j...@nasby.net> wrote:

> > Even better would be if the planner could estimate how bad a plan will
> > become if we made assumptions that turn out to be wrong.
>
> That's precisely what risk estimation was about.

> Something like
>
>     SELECT * FROM wherever WHERE id > something LIMIT COST 10000;
>
> would forbid a sequential scan *if* the table is big enough to suspect the
> plan might cost that much, or a nested loop *if* the planner cannot *prove*
> it will be faster than that.
>
> I don't believe the limit unit is obscure at all (page fetches being a nice
> measuring stick); what is obscure is what to do when no plan fits the limit.

I don't think that's the same thing... what you're describing is a way to refuse 
to begin a query if a low-enough-cost plan can't be found.
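[That refusal behavior could be sketched roughly like this. Everything here is hypothetical: LIMIT COST is not real PostgreSQL syntax, and the plan names and cost numbers are invented for illustration.]

```python
# Hypothetical sketch of "LIMIT COST" semantics: discard any candidate plan
# whose *estimated* cost (say, in page fetches) exceeds the bound, and refuse
# to run the query if nothing survives. Plan names and costs are made up.

def plan_under_cost_limit(candidates, max_cost):
    """Return the cheapest plan whose estimate fits the bound, else None."""
    viable = [p for p in candidates if p["est_cost"] <= max_cost]
    if not viable:
        return None  # the open question: what to do when no plan fits
    return min(viable, key=lambda p: p["est_cost"])

candidates = [
    {"name": "seq_scan",   "est_cost": 250000},  # big table: over any sane bound
    {"name": "index_scan", "est_cost": 4200},
]

print(plan_under_cost_limit(candidates, 10000))  # index scan fits the bound
print(plan_under_cost_limit(candidates, 1000))   # nothing fits -> None
```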

What I'm talking about is when the planner picks one low-cost plan over another 
and it turns out the estimate of the one that was picked was WAY off. I've 
actually seen cases where plan estimates that differed by just 100 units 
produced wildly different results.
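[A toy illustration of that failure mode; all numbers are invented. Two plans sit within 100 cost units of each other, the planner picks the nominally cheaper one on that thin margin, and the misestimated plan turns out to be enormously more expensive at execution time.]

```python
# Made-up numbers: two plans whose estimates differ by only 100 cost units,
# where the "winner" was estimated from a wrong row-count assumption.

plans = {
    "nested_loop": {"est_cost": 1000, "actual_cost": 900000},  # misestimate
    "hash_join":   {"est_cost": 1100, "actual_cost": 1200},
}

# The planner compares estimates only, so it picks the nested loop.
chosen = min(plans, key=lambda name: plans[name]["est_cost"])
print(chosen)  # -> nested_loop

# At execution time the chosen plan is hundreds of times worse.
ratio = plans[chosen]["actual_cost"] / plans["hash_join"]["actual_cost"]
print(f"actual cost vs. the plan not chosen: {ratio:.0f}x")  # -> 750x
```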

In that scenario, LIMIT COST won't help at all.
Jim C. Nasby, Data Architect                       j...@nasby.net
512.569.9461 (cell)                         http://jim.nasby.net

Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)