On Tue, Mar 22, 2005 at 08:09:40AM -0500, Christopher Browne wrote:
> Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark) wrote:
> > I don't think it would be very hard at all actually.
> > It's just a linear algebra problem with a bunch of independent
> > variables and a system [...]

On Mon, 21 Mar 2005 14:59:56 -0800, Josh Berkus <josh@agliodbs.com> wrote:
> If by "not practical" you mean no one has implemented a multivariable
> testing approach, I'll agree with you. But multivariable testing is
> definitely a valid statistical approach to solving just such problems.

Well, not [...]

On Tue, Mar 22, 2005 at 08:09:40 -0500, Christopher Browne [EMAIL PROTECTED] wrote:
> Are you certain it's a linear system? I'm not. If it was a matter of
> minimizing a linear expression subject to some set of linear
> equations, then we could model this as a Linear Program for which
> there [...]

Christopher Browne [EMAIL PROTECTED] writes:
> Are you certain it's a linear system?

If you just consider the GUC parameters that tell Postgres how long various
real-world operations take (all the *_cost parameters), then it's a linear
system. It has to be. The resulting time is just a sum of [...]
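
Stark's claim, that total time is a weighted sum of per-operation costs, can be sketched concretely. The two-parameter model, page counts, and timings below are invented for illustration; with as many independent observed queries as unknown cost constants, the constants fall out of a plain linear solve:

```python
# Hypothetical sketch: recover two planner cost constants from observed
# query times, assuming total_time = sum(operation_count_i * cost_i).
# All numbers below are invented for the illustration.

def solve_2x2(a11, a12, b1, a21, a22, b2):
    """Solve [[a11, a12], [a21, a22]] @ [x, y] = [b1, b2] via Cramer's rule."""
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - a12 * b2) / det
    y = (a11 * b2 - b1 * a21) / det
    return x, y

# Query 1: 1000 sequential page reads,  50 random page reads -> 12.0 ms
# Query 2:  200 sequential page reads, 400 random page reads -> 34.0 ms
seq_cost, rand_cost = solve_2x2(1000, 50, 12.0, 200, 400, 34.0)
print(seq_cost, rand_cost)  # roughly 0.0079 and 0.081 ms per page
```

With more observations than unknowns the same model becomes an overdetermined system, which is where least squares (rather than an exact solve) would come in.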
Greg Stark wrote:
> Richard Huxton <dev@archonet.com> writes:
> > You'd only need to log them if they diverged from expected anyway. That should
> > result in fairly low activity pretty quickly (or we're wasting our time).
> > Should they go to the stats collector rather than logs?
> I think you need to log them [...]

Christopher Browne [EMAIL PROTECTED] writes:
> Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark) wrote:
> > It's just a linear algebra problem with a bunch of independent
> > variables and a system of equations. Solving for values for all of
> > them is a straightforward problem.

Tom Lane [EMAIL PROTECTED] writes:
> Christopher Browne [EMAIL PROTECTED] writes:
> > Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark) wrote:
> > > It's just a linear algebra problem with a bunch of independent
> > > variables and a system of equations. Solving for values for [...]

Greg Stark [EMAIL PROTECTED] writes:
> The times spent in real-world operations like random page accesses, sequential
> page accesses, CPU operations, index lookups, etc., are all measurable
> quantities. They can be directly measured or approximated by looking at the
> resulting net times.

That's the [...]
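
One of those "real world operations" can be timed directly, as a rough sketch. The file size, block size, and approach below are arbitrary choices for illustration, and OS caching will heavily skew the numbers, so treat the output as indicative only:

```python
# Rough sketch: directly timing sequential vs. random block reads of a
# scratch file, analogous to seq_page_cost vs. random_page_cost.
# Block/file sizes are arbitrary; OS caching makes this illustrative only.
import os
import random
import tempfile
import time

BLOCK = 8192          # PostgreSQL's default page size
NBLOCKS = 256

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * NBLOCKS))
    path = f.name

def timed_read(offsets):
    """Read one BLOCK at each offset, returning elapsed wall-clock seconds."""
    with open(path, "rb") as fh:
        start = time.perf_counter()
        for off in offsets:
            fh.seek(off)
            fh.read(BLOCK)
        return time.perf_counter() - start

offsets = [i * BLOCK for i in range(NBLOCKS)]
seq = timed_read(offsets)                            # in-order reads
rnd = timed_read(random.sample(offsets, NBLOCKS))    # shuffled reads
print(f"sequential: {seq:.6f}s  random: {rnd:.6f}s")
os.unlink(path)
```

On a cold cache the random pass would normally be slower; a serious measurement would need O_DIRECT or a cache flush between passes.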
-----Original Message-----
From: Tom Lane [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, March 22, 2005 3:48 PM
To: Greg Stark
Cc: Christopher Browne; pgsql-performance@postgresql.org
Subject: Re: [PERFORM] What about utility to calculate planner cost constants?

[...]

The difficulty [...]

-----Original Message-----
From: Dave Held
Sent: Tuesday, March 22, 2005 4:16 PM
To: Tom Lane
Cc: pgsql-performance@postgresql.org
Subject: Re: [PERFORM] What about utility to calculate planner cost constants?

[...]

Then instead of building a fixed cost model, why not evolve [...]

[EMAIL PROTECTED] (Dave Held) writes:
> -----Original Message-----
> From: Tom Lane [mailto:[EMAIL PROTECTED]]
> Sent: Tuesday, March 22, 2005 3:48 PM
> To: Greg Stark
> Cc: Christopher Browne; pgsql-performance@postgresql.org
> Subject: Re: [PERFORM] What about utility to calculate planner cost [...]

Tambet,

> I was following the cpu_tuple_cost thread and wondering if it could be
> possible to make a PQA-style utility to calculate configuration-specific
> values for planner cost constants. It could make use of the output of
> log_(statement|parser|planner|executor)_stats, though I'm not sure if the [...]
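
The log_*_stats parameters named above are real PostgreSQL GUCs that dump per-stage timing statistics to the server log. A postgresql.conf fragment to enable them might look like the following; note the overhead and log volume make this a poor fit for busy production servers, and log_statement_stats cannot be combined with the per-module settings:

```
# Dump combined timing statistics for each statement to the server log:
log_statement_stats = on

# ...or enable per-module statistics instead (mutually exclusive with
# log_statement_stats):
# log_parser_stats   = on
# log_planner_stats  = on
# log_executor_stats = on
```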
If by "not practical" you mean no one has implemented a multivariable
testing approach, I'll agree with you. But multivariable testing is
definitely a valid statistical approach to solving just such problems.

-tfo

--
Thomas F. O'Connell
Co-Founder, Information Architect
Sitening, LLC

Thomas,

> If by "not practical" you mean no one has implemented a multivariable
> testing approach, I'll agree with you. But multivariable testing is
> definitely a valid statistical approach to solving just such problems.

Well, not practical as in: would take either $10 million in equipment or [...]

Josh Berkus <josh@agliodbs.com> writes:
> Otherwise it could just collect statements, run EXPLAIN ANALYZE for all
> of them, and then play with planner cost constants to get the estimated
> values as close as possible to the actual values. Something like Goal Seek
> in Excel, if you pardon my reference. [...]
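
The "Goal Seek" idea, that is, searching for cost constants that minimize the gap between estimated and actual times, can be sketched as below. The sample data, the two-parameter cost model, and the naive grid search are all invented for illustration; a real tool would parse EXPLAIN ANALYZE output and use a proper least-squares or numerical optimizer:

```python
# Hedged sketch of the "Goal Seek" approach: given (operation counts,
# actual ms) pairs notionally collected from EXPLAIN ANALYZE, grid-search
# for the cost constants minimizing squared error between modeled and
# actual times. All sample numbers are invented.

# Each sample: (sequential_pages, random_pages, actual_ms)
samples = [
    (1000, 50, 12.5),
    (200, 400, 33.0),
    (5000, 10, 41.0),
]

def error(seq_cost, rand_cost):
    """Sum of squared residuals of the linear cost model over the samples."""
    return sum((s * seq_cost + r * rand_cost - t) ** 2
               for s, r, t in samples)

# Naive grid search over 0.001..0.199 ms/page for each constant.
best = min(
    ((sc / 1000.0, rc / 1000.0)
     for sc in range(1, 200) for rc in range(1, 200)),
    key=lambda c: error(*c),
)
print("fitted (seq, random) page costs in ms:", best)
```

Since the model is linear in the constants, the grid search is overkill; the point is only that the objective ("estimated as close as possible to actual") is easy to state and minimize.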
15 matches