On Friday 05 December 2008 00:05:34 Robert Haas wrote:
> On Thu, Nov 27, 2008 at 6:46 PM, Gregory Stark <[EMAIL PROTECTED]> wrote:
> >> ANALYZE with default_statistics_target set to 10 takes 13 s.  With
> >> 100, 92 s.  With 1000, 289 s.
> >
> > That is interesting. It would also be interesting to total up the time it
> > takes to run EXPLAIN (without ANALYZE) for a large number of queries.
>

I wonder if we'd see anything dramatically different using PREPARE... 
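
For instance, something like this in psql would show whether PREPARE time tracks EXPLAIN time for the same statement (the table "foo" and the query itself are just placeholders here):

\timing
-- EXPLAIN time is roughly parse + plan + printing the plan
EXPLAIN SELECT * FROM foo WHERE id < 1000 LIMIT 100;
-- PREPARE time is roughly parse + plan (no output), EXECUTE is run time
PREPARE q(int) AS SELECT * FROM foo WHERE id < $1 LIMIT 100;
EXECUTE q(1000);
DEALLOCATE q;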

> OK, I did this.  I actually tried 10 .. 100 in increments of 10 and
> then 100 ... 1000 in increments of 50, for 7 different queries of
> varying complexity (but all generally similar, including all of them
> having LIMIT 100 as is typical for this database).  I planned each
> query 100 times with each default_statistics_target.  The results were
> somewhat underwhelming.
>

The one thing this test seems to overlook is at what point we see 
diminishing returns from increasing default_statistics_target. I think the 
way to get at that would be to plot the dst setting against query time; 
Robert, do you think you could modify your test to measure prepare time and 
then execute time over a series of runs? 
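
Something along these lines would do it, I think (just a rough sketch -- the 
database name, table, and query below are placeholders; in practice you would 
substitute the seven queries from your test and repeat each EXECUTE several 
times per setting):

#!/bin/sh
# For each default_statistics_target, re-ANALYZE, then time PREPARE
# (parse + plan) and EXECUTE (run) separately via psql's \timing.
for dst in 10 50 100 250 500 750 1000; do
    psql -d testdb <<EOF
SET default_statistics_target = $dst;
ANALYZE;
\timing
PREPARE q(int) AS SELECT * FROM foo WHERE id < \$1 LIMIT 100;
EXECUTE q(1000);
DEALLOCATE q;
EOF
done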

-- 
Robert Treat
Conjecture: http://www.xzilla.net
Consulting: http://www.omniti.com
