On Wed, Oct 24, 2012 at 5:40 PM, Merlin Moncure <mmonc...@gmail.com> wrote:
> On Wed, Oct 24, 2012 at 3:51 PM, Merlin Moncure <mmonc...@gmail.com> wrote:
>> On Wed, Oct 24, 2012 at 3:33 PM, Tom Lane <t...@sss.pgh.pa.us> wrote:
>>> Merlin Moncure <mmonc...@gmail.com> writes:
>>>> Yeah -- I have a case where a large number of joins are happening that
>>>> have a lot of filtering based on expressions and things like that.
>>>
>>> Might be worth your while to install some indexes on those expressions,
>>> if only to trigger collection of stats about them.
>>
>> Not practical -- these expressions are all about 'outlier culling'.
>> It's just wasteful to materialize indexes for statistical purposes only.
>> Anyway, in this case, I just refactored the query into a CTE.
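
For readers following the thread: Tom's suggestion works because ANALYZE
gathers statistics for the expression columns of an expression index just
as it does for plain columns, and the planner can then estimate quals that
use exactly that expression; the CTE workaround works because a CTE in
PostgreSQL of this era is an optimization fence, so the culling is
materialized before the joins are planned. A rough sketch, with
hypothetical table and column names (orders, customers, price, quantity):

    -- An index on the filter expression makes ANALYZE collect a histogram
    -- and MCV list for it (note the doubled parentheses an expression
    -- index requires):
    CREATE INDEX orders_unit_price_idx ON orders ((price / quantity));
    ANALYZE orders;

    -- The CTE refactoring: materialize the outlier-culled rows first so
    -- the join is planned against the CTE's actual output rather than a
    -- mis-estimated expression qual:
    WITH culled AS (
        SELECT * FROM orders WHERE price / quantity < 1000
    )
    SELECT c.name, sum(o.price) AS total
    FROM culled o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.name;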
Apologies for blabbing, but I wonder if a solution to this problem might be
for the planner to identify low-cost/high-impact scenarios in which it could
simply run some of the stored statistical values through qualifying stable
expressions, particularly when the input variables are constants or are
single-sourced from one table.

Over the years the planner has become very precise in its choice of
algorithms, and that precision makes the cost of a statistics miss
increasingly dangerous, a trend I think is reflected in the regression
reports on -performance.
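
To make the idea concrete (an illustrative sketch only, reusing the
hypothetical orders table from above; nothing like this exists in the
planner today): the statistics such a mechanism would reuse are already
exposed through pg_stats, so for a stable expression whose inputs come from
one table, one can picture the planner mapping the stored histogram bounds
through the expression instead of falling back on a hard-wired default
selectivity:

    -- The stored per-column statistics the proposal would feed through
    -- the expression:
    SELECT attname, n_distinct, most_common_vals, histogram_bounds
    FROM pg_stats
    WHERE schemaname = 'public' AND tablename = 'orders';

    -- Sketch of the estimate: for a qual such as
    --   WHERE price / quantity < 1000
    -- evaluate the expression at the stored histogram bounds of its
    -- inputs and estimate selectivity from where the constant 1000 falls,
    -- instead of using the default guess for an unknown expression.

merlin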