On 12/7/15 10:44 AM, Simon Riggs wrote:
> There are many optimizations we might adopt, yet planning time is a
> factor. It seems simple enough to ignore more complex optimizations if
> we have already achieved a threshold cost (say 10). Such a test would
> add nearly zero time for the common case. We can apply the optimizations
> in some kind of ordering depending upon the cost, so we are careful to
> balance the cost/benefit of trying certain optimizations.

Unfortunately, I've seen a lot of millisecond queries with six-figure cost estimates, due to the data being in cache, so I'm not sure how practical that would be.

Maybe a better starting point would be a planner timeout.

I definitely agree we need some method to limit planning time when necessary (i.e., OLTP). Without that we'll never be able to start testing more complex optimizations.
Jim Nasby, Data Architect, Blue Treble Consulting, Austin TX
Experts in Analytics, Data Architecture and PostgreSQL
Data in Trouble? Get it in Treble! http://BlueTreble.com

Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)