A statistics target of 400 for a specific column may make sense, but even then I would recommend monitoring performance to ensure it doesn't cause problems. As a global setting it's, IMHO, ridiculous.

Even for the smaller data types (except boolean and "char"), an array of 400 entries will be large enough to be toasted. Planning queries will then involve many more disk I/Os than some of those queries end up taking themselves. Even for stats which are already cached, there are algorithms in the planner known to be inefficient for large arrays.
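For anyone who wants to check this on their own database, something like the following query (a sketch, assuming superuser access to pg_statistic; table and limit are arbitrary) shows which columns have the fattest statistics rows and would be candidates for toasting:

```sql
-- Find the largest per-column statistics entries.
-- pg_column_size() reports the on-disk size of each pg_statistic row.
SELECT c.relname,
       a.attname,
       pg_column_size(s.*) AS stats_bytes
FROM pg_statistic s
JOIN pg_class c     ON c.oid = s.starelid
JOIN pg_attribute a ON a.attrelid = s.starelid
                   AND a.attnum   = s.staattnum
ORDER BY stats_bytes DESC
LIMIT 10;
```

Rows much over the ~2kB toast threshold are the ones the planner would have to fetch from the toast table at plan time.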

It may make sense for specific skewed columns with indexes on them, but keep in mind Postgres needs to consult the statistics on any column referenced in a qual even if there are no indexes, and most data distributions do fine with a target of 10.
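That per-column approach doesn't require touching the global setting at all; it can be done like this (table and column names here are hypothetical):

```sql
-- Raise the target only where the distribution is skewed enough to need it,
-- then re-gather statistics for that table.
ALTER TABLE orders ALTER COLUMN customer_id SET STATISTICS 400;
ANALYZE orders;
```

Every other column stays at default_statistics_target, so plan time is only paid for where the extra histogram resolution actually buys better plans.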

I think we all agree the default may need to be raised, but until there is some data we have little basis to recommend anything specific.

I would suggest starting from the premise that "mixed" (with a conservative memory setting) is the same as the Postgres default. Perhaps (probably) the defaults should be changed, but we shouldn't have two different tools with different (drastically different!) ideas for the same situation.
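Concretely, that premise would mean the tool's "mixed" profile emits nothing beyond what ships in postgresql.conf today (a sketch; 10 is the shipped default at the time of this thread):

```
# postgresql.conf -- "mixed" profile matching the stock Postgres default
default_statistics_target = 10
```

Any deviation from the stock value would then be a deliberate, documented decision rather than two tools silently disagreeing.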

greg

On 13 Nov 2008, at 07:46 PM, Josh Berkus <[EMAIL PROTECTED]> wrote:

Gregory Stark wrote:
Josh Berkus <[EMAIL PROTECTED]> writes:
DW:
   default_statistics_target = 400
Mixed:
   default_statistics_target = 100
You, my friend, are certifiably insane.

Hmmm? Why? I've used those settings in the field, fairly frequently. I was actually wondering if we should raise the default for web as well, but decided to let it alone.

Actually, I think a DW should begin at 400; often it needs to go up to 1000, but I don't think a script should do that.

--Josh


--
Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers
