Re: Avoiding out of date statistics / planner
That looks very useful indeed. Thanks Tomas

On Wed, Feb 12, 2020 at 8:32 PM Tomas Vondra wrote:
> On Wed, Feb 12, 2020 at 10:23:22AM -0700, Michael Lewis wrote:
> > It may also be worth noting that it is possible to make
> > autovacuum/analyze more aggressive, perhaps only on the tables that see
> > large changes in data that might result in a statistics issue. If you
> > could share a query, EXPLAIN ANALYZE output, and pseudocode or at least
> > a description of what sort of bulk operations are being done, then more
> > insight could be offered.
>
> Another thing you can do is deploy auto_explain and log the explain plan
> for long-running queries. That won't fix the root cause, but it will
> help you confirm the root cause - you'll see the query plan, which
> should give you enough context.
>
> regards
>
> --
> Tomas Vondra                  http://www.2ndQuadrant.com
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
Re: Avoiding out of date statistics / planner
On Wed, Feb 12, 2020 at 10:23:22AM -0700, Michael Lewis wrote:
> It may also be worth noting that it is possible to make
> autovacuum/analyze more aggressive, perhaps only on the tables that see
> large changes in data that might result in a statistics issue. If you
> could share a query, EXPLAIN ANALYZE output, and pseudocode or at least
> a description of what sort of bulk operations are being done, then more
> insight could be offered.

Another thing you can do is deploy auto_explain and log the explain plan
for long-running queries. That won't fix the root cause, but it will
help you confirm the root cause - you'll see the query plan, which
should give you enough context.

regards

--
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
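[A minimal auto_explain setup along the lines Tomas suggests; the one-second threshold is illustrative, and the parameters shown (`log_min_duration`, `log_analyze`, `log_buffers`) are standard auto_explain settings:]

```sql
-- Load auto_explain for the current session. To enable it server-wide,
-- add it to shared_preload_libraries (or session_preload_libraries)
-- in postgresql.conf instead.
LOAD 'auto_explain';

-- Log the plan of any statement that runs longer than 1 second.
SET auto_explain.log_min_duration = '1s';

-- Include EXPLAIN ANALYZE detail (actual row counts and timings)
-- rather than just the estimated plan.
SET auto_explain.log_analyze = on;

-- Buffer usage is often useful when diagnosing a bad plan.
SET auto_explain.log_buffers = on;
```

[With this in place, the plan of the 20-minute query lands in the server log the next time it occurs, so it can be compared against the fast plan seen when running it by hand.]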
Re: Avoiding out of date statistics / planner
It may also be worth noting that it is possible to make autovacuum/analyze more aggressive, perhaps only on the tables that see large changes in data that might result in a statistics issue. If you could share a query, EXPLAIN ANALYZE output, and pseudocode or at least a description of what sort of bulk operations are being done, then more insight could be offered.
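[A sketch of the per-table tuning Michael describes; the table name and thresholds are illustrative, and the storage parameters used are PostgreSQL's standard per-table autovacuum options:]

```sql
-- Trigger auto-analyze on this table after ~5% of rows change
-- (default is 10%), and auto-vacuum after ~10% (default is 20%),
-- so statistics stay fresher after bulk changes.
ALTER TABLE bulk_target SET (
    autovacuum_analyze_scale_factor = 0.05,
    autovacuum_analyze_threshold    = 500,
    autovacuum_vacuum_scale_factor  = 0.10
);
```

[Because these are per-table settings, the rest of the database keeps the global defaults.]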
Re: Avoiding out of date statistics / planner
Tim Kane writes:
> Every now and again, I will encounter an unexplained long-running query.
> It’s a head scratcher moment, because this query that is still running for
> 20 minutes (not blocking) can be run independently in about 500ms

Without some kind of context (like, have you been doing something to the
table(s) involved that would drastically change their statistics) it's
hard to comment on this. It's not obvious from the info provided that
this is a bad-plan issue rather than something else.

> On the application side, we can explicitly issue a VACUUM ANALYZE after
> each bulk operation - and often that is precisely what happens..
> But - I am keenly aware that this cannot be performed within a transaction.

Plain ANALYZE can be, and that's all you need if the problem is to update
stats.

regards, tom lane
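[A sketch of the pattern Tom describes, with hypothetical table and file names: plain ANALYZE, unlike VACUUM, is permitted inside a transaction block, so statistics can be refreshed as part of the bulk operation itself:]

```sql
BEGIN;

-- Hypothetical bulk load.
COPY sales_staging FROM '/tmp/sales.csv' WITH (FORMAT csv);
INSERT INTO sales SELECT * FROM sales_staging;

-- Refresh planner statistics before any later statements in this
-- transaction (or immediately after commit) query the table.
-- VACUUM ANALYZE would error out here; plain ANALYZE is allowed.
ANALYZE sales;

COMMIT;
```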