On Tue, Mar 22, 2005 at 08:09:40AM -0500, Christopher Browne wrote:
> Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark)
> wrote:
> > I don't think it would be very hard at all actually.
> >
> > It's just a linear algebra problem with a bunch of independent
> > variables and a system of equations.
[EMAIL PROTECTED] ("Dave Held") writes:
>> -----Original Message-----
>> From: Tom Lane [mailto:[EMAIL PROTECTED]
>> Sent: Tuesday, March 22, 2005 3:48 PM
>> To: Greg Stark
>> Cc: Christopher Browne; pgsql-performance@postgresql.org
>> Subject: Re: [PERFORM] What about utility to calculate planner cost
>> constants?
Tom Lane wrote:
And you can't just dismiss the issue of wrong cost models and say we can
get numbers anyway.
Is there a way to see more details about the cost estimates?
EXPLAIN ANALYZE seems to show the total time and rows, but not
information like how many disk pages were accessed.
I get the feel
> -----Original Message-----
> From: Dave Held
> Sent: Tuesday, March 22, 2005 4:16 PM
> To: Tom Lane
> Cc: pgsql-performance@postgresql.org
> Subject: Re: [PERFORM] What about utility to calculate planner cost
> constants?
> [...]
> Then instead of building a fixed
> -----Original Message-----
> From: Tom Lane [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, March 22, 2005 3:48 PM
> To: Greg Stark
> Cc: Christopher Browne; pgsql-performance@postgresql.org
> Subject: Re: [PERFORM] What about utility to calculate planner cost
> constants?
>
Greg Stark <[EMAIL PROTECTED]> writes:
> The time spent in real-world operations like random page accesses, sequential
> page accesses, cpu operations, index lookups, etc, are all measurable
> quantities. They can be directly measured or approximated by looking at the
> resulting net times.
That's
Tom Lane <[EMAIL PROTECTED]> writes:
> Christopher Browne <[EMAIL PROTECTED]> writes:
> > Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark)
> > wrote:
> >> It's just a linear algebra problem with a bunch of independent
> >> variables and a system of equations. Solving for
Greg Stark <[EMAIL PROTECTED]> writes:
> Christopher Browne <[EMAIL PROTECTED]> writes:
>> Are you certain it's a linear system?
> If you just consider the guc parameters that tell postgres how long various
> real world operations take (all the *_cost parameters) then it's a linear
> system. It has to be.
Christopher Browne <[EMAIL PROTECTED]> writes:
> Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark)
> wrote:
>> It's just a linear algebra problem with a bunch of independent
>> variables and a system of equations. Solving for values for all of
>> them is a straightforward p
Greg Stark wrote:
Richard Huxton writes:
You'd only need to log them if they diverged from expected anyway. That should
result in fairly low activity pretty quickly (or we're wasting our time).
Should they go to the stats collector rather than logs?
I think you need to log them all. Otherwise when
Christopher Browne <[EMAIL PROTECTED]> writes:
> Are you certain it's a linear system?
If you just consider the guc parameters that tell postgres how long various
real world operations take (all the *_cost parameters) then it's a linear
system. It has to be. The resulting time is just a sum of
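Greg's claim can be made concrete with a small sketch: if each query's runtime really is a sum of per-operation counts times per-operation costs, and the executor could report those counts, then recovering the *_cost constants is an ordinary least-squares problem. Everything below is invented for illustration; no PostgreSQL instrumentation is assumed to exist.

```python
# Hedged sketch: recover per-operation cost constants from measured
# query times, assuming runtime is linear in the operation counts.
import numpy as np

# Rows: observed queries. Columns: counts of each operation type
# (e.g. sequential pages, random pages, tuples processed).
# These counts are made up; in practice they would have to come
# from instrumenting the executor.
op_counts = np.array([
    [1000,   10,  5000],
    [ 200,  400,  1000],
    [5000,    5, 20000],
    [  50, 2000,   300],
], dtype=float)

# Synthetic "true" costs, used only to fabricate measured times.
true_costs = np.array([0.01, 0.05, 0.001])
measured_ms = op_counts @ true_costs

# Solve the overdetermined linear system for the cost constants.
costs, *_ = np.linalg.lstsq(op_counts, measured_ms, rcond=None)
print(costs)  # recovers [0.01, 0.05, 0.001]
```

With real (noisy) timings the system would not be solved exactly, but least squares still yields the best-fitting constants, which is precisely what makes the problem linear rather than a blind multivariable search.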
Richard Huxton writes:
> You'd only need to log them if they diverged from expected anyway. That should
> result in fairly low activity pretty quickly (or we're wasting our time).
> Should they go to the stats collector rather than logs?
I think you need to log them all. Otherwise when you go to
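Richard's "log only on divergence" idea can be sketched as a simple filter over plan nodes: keep a record only when actual and estimated figures differ by more than some factor. The function name, node records, and threshold below are all hypothetical, not any real PostgreSQL hook.

```python
# Hedged sketch: record a plan node only when its actual rows
# diverge from the planner's estimate by more than `factor`x.
def diverged(estimated, actual, factor=2.0):
    """True when actual is off from estimated by more than `factor`x."""
    if estimated <= 0 or actual <= 0:
        return estimated != actual
    ratio = actual / estimated
    return ratio > factor or ratio < 1.0 / factor

nodes = [
    {"node": "Seq Scan",   "est_rows": 1000, "act_rows": 1100},
    {"node": "Index Scan", "est_rows": 10,   "act_rows": 4500},
]
to_log = [n for n in nodes if diverged(n["est_rows"], n["act_rows"])]
print([n["node"] for n in to_log])  # only the badly-estimated Index Scan
```

Greg's counterpoint survives the sketch: a filter like this discards the well-estimated nodes, which are exactly the data points a later fitting step would need.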
On Tue, Mar 22, 2005 at 08:09:40 -0500,
Christopher Browne <[EMAIL PROTECTED]> wrote:
>
> Are you certain it's a linear system? I'm not. If it was a matter of
> minimizing a linear expression subject to some set of linear
> equations, then we could model this as a Linear Program for which
> th
Martha Stewart called it a Good Thing when [EMAIL PROTECTED] (Greg Stark) wrote:
> I don't think it would be very hard at all actually.
>
> It's just a linear algebra problem with a bunch of independent
> variables and a system of equations. Solving for values for all of
> them is a straightforward
Greg Stark wrote:
Josh Berkus writes:
That's not really practical. There are currently 5 major query tuning
parameters, not counting the memory adjustments which really can't be left
out. You can't realistically test all combinations of 6 variables.
I don't think it would be very hard at all actually.
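Josh's objection is easy to quantify: even a coarse five-point sweep of six knobs is thousands of benchmark runs. The parameter names below are real PostgreSQL GUCs, but the candidate values are arbitrary and purely illustrative.

```python
# Hedged sketch: the combinatorial cost of exhaustively testing
# all combinations of six tuning parameters at five values each.
from itertools import product

grid = {
    "random_page_cost":      [1.0, 2.0, 4.0, 8.0, 16.0],
    "cpu_tuple_cost":        [0.0025, 0.005, 0.01, 0.02, 0.04],
    "cpu_index_tuple_cost":  [0.00025, 0.0005, 0.001, 0.002, 0.004],
    "cpu_operator_cost":     [0.000625, 0.00125, 0.0025, 0.005, 0.01],
    "effective_cache_size":  ["64MB", "256MB", "1GB", "4GB", "16GB"],
    "work_mem":              ["1MB", "4MB", "16MB", "64MB", "256MB"],
}
combos = list(product(*grid.values()))
print(len(combos))  # 5**6 = 15625 benchmark runs for a five-point grid
```

If each combination needs even a minute of benchmarking, that is over ten days of machine time, which is why the thread keeps returning to fitting the constants from observed data instead of sweeping them.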
On Mon, 21 Mar 2005 14:59:56 -0800, Josh Berkus wrote:
> > If by not practical you mean, "no one has implemented a multivariable
> > testing approach," I'll agree with you. But multivariable testing is
> > definitely a valid statistical approach to solving just such problems.
> Well, not practical
Josh Berkus writes:
> > Otherwise it could just collect statements, run EXPLAIN ANALYZE for all
> > of them and then play with planner cost constants to get the estimated
> > values as close as possible to actual values. Something like Goal Seek
> > in Excel, if you pardon my reference to MS :).
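Tambet's Goal-Seek analogy can be sketched as a one-parameter search: pick the cost constant whose predicted plan times best match the measured ones. The cost model, data, and fixed 0.01 per-tuple coefficient below are all hypothetical, chosen only to show the shape of the fit.

```python
# Hedged "Goal Seek" sketch: brute-force the page-cost constant that
# minimizes squared error between a toy cost model and measured times.
def total_error(c, plans):
    # plans: (pages, tuples, measured_ms); model: c*pages + 0.01*tuples
    return sum((c * p + 0.01 * t - ms) ** 2 for p, t, ms in plans)

# Invented observations: (pages read, tuples processed, measured ms).
plans = [(1000, 5000, 90.0), (200, 1000, 18.0), (5000, 20000, 450.0)]

candidates = [x / 100 for x in range(1, 101)]  # 0.01 .. 1.00
best = min(candidates, key=lambda c: total_error(c, plans))
print(best)  # 0.05 fits these observations best
```

A real utility would fit all the constants jointly, but the principle is the same: treat the planner's cost formula as a model and search for the parameter values that make its predictions track reality.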
Thomas,
> If by not practical you mean, "no one has implemented a multivariable
> testing approach," I'll agree with you. But multivariable testing is
> definitely a valid statistical approach to solving just such problems.
Well, not practical as in: "would take either $10 million in equipment o
If by not practical you mean, "no one has implemented a multivariable
testing approach," I'll agree with you. But multivariable testing is
definitely a valid statistical approach to solving just such problems.
-tfo
--
Thomas F. O'Connell
Co-Founder, Information Architect
Sitening, LLC
http://www
Tambet,
> I was following the cpu_tuple_cost thread and wondering, if it could be
> possible to make PQA style utility to calculate configuration-specific
> values for planner cost constants. It could make use of output of
> log_(statement|parser|planner|executor)_stats, though I'm not sure if the