Ragnar Hafstað wrote:

> it is not rational to have random_page_cost < 1.

I agree; in theory one should never *need* to set it < 1. However, in
cases where the optimizer's understanding of things is a little off,
compensation may be required to achieve better plans (e.g. encouraging
index scans on data with funny distributions or correlations).
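For concreteness, that compensation is usually applied by lowering
random_page_cost from its default of 4 in postgresql.conf. The values
below are purely illustrative, not recommendations:

```
# postgresql.conf -- illustrative values only
random_page_cost = 2      # default is 4; lowering it favors index scans
#random_page_cost = 0.9   # below 1 is the compensation under discussion
```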

> if you see improvement with such a setting, it is as likely that
> something else is wrong, such as higher statistics targets needed,
> or a much too low effective_cache setting.

Although this is good advice, it is not always sufficient. For instance,
I have effective_cache_size = 20000. The machine has 512 MB of RAM,
right now cache+buf+free is about 100 MB, and shared_buffers = 2000, so
in fact I probably have it a bit high :-).
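A quick sanity check on those numbers (a sketch, assuming PostgreSQL's
default 8 kB page size, since both settings are counted in pages):

```python
# Assumes the default 8 kB PostgreSQL page size; both
# effective_cache_size and shared_buffers are measured in pages.
PAGE_KB = 8

effective_cache_size = 20000   # pages
shared_buffers = 2000          # pages
observed_cache_mb = 100        # cache+buf+free reported by the OS

ecs_mb = effective_cache_size * PAGE_KB / 1024
sb_mb = shared_buffers * PAGE_KB / 1024

print(f"effective_cache_size ~= {ecs_mb:.2f} MB")  # ~156 MB
print(f"shared_buffers       ~= {sb_mb:.2f} MB")   # ~15.6 MB

# effective_cache_size (~156 MB) exceeds the ~100 MB the OS is
# actually caching right now -- hence "a bit high".
print(ecs_mb > observed_cache_mb)
```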

Increasing the statistics target may make the situation better or worse:
a larger sample of the data is obtained for analysis, but that is not
*guaranteed* to lead to a faster execution plan, even if it usually does.
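For reference, the per-column statistics target can be raised like this
(the table and column names are hypothetical; 10 was the default target
at the time), followed by a fresh ANALYZE so the new sample takes effect:

```sql
-- Hypothetical table and column; raises the target from the default 10.
ALTER TABLE orders ALTER COLUMN customer_id SET STATISTICS 100;
ANALYZE orders;
```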


