>>> "Robert Haas" <[EMAIL PROTECTED]> wrote: > On Wed, Dec 3, 2008 at 4:41 PM, Joshua D. Drake <[EMAIL PROTECTED]> wrote: >> If you are concerned about the analyze time between 10, 50 and 150, I >> would suggest that you are concerned about the wrong things. Remember > > I can't rule that out. What things do you think I should be concerned > about? ISTM that default_statistics_target trades off ANALYZE time > and query planning time vs. the possibility of better plans. If the > former considerations are not an issue for dst = 50, then maybe we > should emit 50 by default. But the limited evidence that has been > published in this forum thus far doesn't support that contention. One more data point to try to help. While the jump from a default_statistics_target from 10 to 1000 resulted in a plan time increase for a common query from 50 ms to 310 ms, at a target of 50 the plan time was 53 ms. Analyze time was 7.2 minutes and 18.5 minutes for targets of 10 and 50. This is an 842 GB database on an 8 processor (3.5 GHz Xeon) machine with 64 GB RAM running (soon to be updated) PostgreSQL 8.2.7. Based on the minimal plan time increase of this test, we're going to try 50 in production and see how it goes. It's worth pondering that at the target of 1000, had we put that into production, running this query 300,000 times per day would have used 21 hours and 40 minutes of additional CPU time per day on planning the runs of this one query, while a target of 50 only consumes an additional 15 minutes of 3.5 GHz CPU time per day. -Kevin