The tables are analyzed, though I would love to find a way to increase the
accuracy of the statistics.
I tried raising the statistics target up to 100, but it did not help. Should I
bump it even higher?
However, I found that if I add depth to the GROUP BY clauses, it somehow
tells the optimizer that it would get ...
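For reference, a minimal sketch of raising the per-column statistics target
and re-analyzing; the table and column names here are only placeholders, not
the actual schema:

    -- raise the sample size the planner keeps for this column
    ALTER TABLE orders ALTER COLUMN customer_id SET STATISTICS 500;
    -- re-collect statistics so the new target takes effect
    ANALYZE orders;

Whether a higher target actually improves the row estimates depends on how
skewed the data is; it is worth comparing estimated vs. actual row counts in
EXPLAIN ANALYZE after each change.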
"Virag Saksena" <[EMAIL PROTECTED]> writes:
> The individual queries run in 50-300 ms. However the optimizer is
> choosing a nested loop to join them rather than a Hash join...
I have what appears to be the identical problem.
This is a straightforward query that should be fairly quick, but takes a
"Virag Saksena" <[EMAIL PROTECTED]> writes:
> The individual queries run in 50-300 ms. However the optimizer is
> choosing a nested loop to join them rather than a Hash join
> causing the complete query to take 500+ seconds. It expects that it will
> get 1 row out from each of the sources, but
Hi,
I have a query where I do two inline queries (which involve grouping) and
then join them with an outer join.
The individual queries run in 50-300 ms. However, the optimizer is choosing
a nested loop to join them rather than a hash join, causing the complete
query to take 500+ seconds. It expects that it will get 1 row out from each
of the sources, but
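A quick way to confirm that the join choice, rather than the individual
queries, is the problem is to disable nested loops for one session and re-run
the plan. This is only a diagnostic sketch (the query itself is the poster's,
so it is left as a placeholder), not a setting to keep in production:

    -- diagnostic only: steer the planner away from nested loops in this session
    SET enable_nestloop = off;

    -- re-run the slow query and compare the join node and total runtime
    EXPLAIN ANALYZE
    SELECT ... ;  -- the full outer-join query from the original post

    -- restore the default afterwards
    RESET enable_nestloop;

If the hash join plan comes back in a few hundred milliseconds, the real
issue is the rows=1 estimates feeding the join, not the join method itself.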
PFC wrote:
> I'm developing a search engine using the PostgreSQL database. I've
> already done some tuning to try to improve performance.
> Now I'd like to run a realistic test with some number X of concurrent
> queries, to see how it performs under that load.
> What is the correct way to do this?
I guess the only
Fredrik Olsson <[EMAIL PROTECTED]> writes:
> Is a self contained test-case for the old way with the "proxy-view" is
> still wanted?
Yes, something still seems funny there. And for that matter it wasn't
clear what was wrong with your proxy view, either.
regards, tom lane
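For anyone wanting to put such a test case together: a self-contained
reproduction usually just needs the table, the access-check function, the
view, some data, and the EXPLAIN ANALYZE that shows the bad plan. The sketch
below uses made-up column types and a trivial stand-in function; it only
loosely mirrors the t_entities / haveaccess() objects that appear later in
the thread and is not the actual schema:

    -- minimal skeleton of a self-contained test case (hypothetical definitions)
    CREATE TABLE t_entities (
        id          serial PRIMARY KEY,
        createdby   integer,
        responsible integer,
        "class"     name,
        partof      name
    );

    -- trivial stand-in for the real access-check function
    CREATE FUNCTION haveaccess(integer, integer, name, boolean)
    RETURNS boolean AS $$ SELECT true; $$ LANGUAGE sql STABLE;

    -- the "proxy" view that applies the access check
    CREATE VIEW v_entities AS
        SELECT * FROM t_entities
        WHERE haveaccess(createdby, responsible, "class", false);

    -- enough rows to make the plan interesting
    INSERT INTO t_entities (createdby, responsible, "class", partof)
        SELECT i, i, 'contacts', 'contacts' FROM generate_series(1, 1000) AS s(i);
    ANALYZE t_entities;

    EXPLAIN ANALYZE SELECT * FROM v_entities;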
Hi,
I'm developing a search engine using the PostgreSQL database. I've
already done some tuning to try to improve performance.
Now I'd like to run a realistic test with some number X of concurrent
queries, to see how it performs under that load.
What is the correct way to do this?
Thanks.
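One common approach, sketched here with placeholder file, table, client and
database names, is to put one or more representative search queries into a
script file and replay them concurrently with pgbench (from contrib), e.g.
pgbench -n -f search_queries.sql -c 20 -t 100 mydb, which runs the script
with 20 concurrent clients, 100 iterations per client (-n skips the vacuum
of pgbench's own tables, which you are not using). The script itself is just
SQL, for example:

    -- search_queries.sql: one representative query against a hypothetical schema
    SELECT count(*) FROM documents WHERE title LIKE '%postgres%';

Using queries that look like the ones the application will actually issue
matters far more than the exact client count.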
---
Tom Lane wrote:
Fredrik Olsson <[EMAIL PROTECTED]> writes:
->  Seq Scan on t_entities  (cost=0.00..1.49 rows=7 width=4) (actual time=404.539..409.302 rows=2 loops=1)
      Filter: ((haveaccess(createdby, responsible, "class", false) OR CASE WHEN (partof = 'contacts'::name) THE