Brian Herlihy wrote:
We have a problem with some of our query plans.  One of our
tables is quite volatile, but Postgres always plans queries
using the statistics snapshot from the last time the table was
analyzed.  Is there a way to tell Postgres not to trust the
statistics for this table?  Basically we want it to assume that
a query on that table may return 0, 1 or 100,000 rows at any
time, and not to make any assumptions.

I had a similar problem, and just changed my application to run an
ANALYZE either just before the query, or just after a major update to
the table.  ANALYZE is very fast, almost always orders of magnitude
faster than the time lost to a poor query plan.
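
For example, a minimal sketch of the pattern (the table name
volatile_table and the queries are hypothetical, just to show where
the ANALYZE goes):

    -- Refresh planner statistics right after a bulk change
    UPDATE volatile_table SET status = 'done' WHERE batch_id = 42;
    ANALYZE volatile_table;

    -- Or refresh just before the query whose plan keeps going stale
    ANALYZE volatile_table;
    SELECT count(*) FROM volatile_table WHERE status = 'done';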

Craig
