I'm trying to improve the speed of a suite of queries that run across a few
million rows.

They use two main "filters" across a variety of columns:

        WHERE (col_1 IS NULL) AND (col_2 IS NULL)
          AND ((col_3 IS NULL) OR (col_3 = col_1))

        WHERE (col_1 IS TRUE) AND (col_2 IS TRUE) AND (col_3 IS TRUE)
           OR (col_4 IS NULL)

I created a dedicated multi-column index for each query to speed them up.  That 
was great.
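
For concreteness, the indexes I built look something like this (the table
and column names here are placeholders):

        CREATE INDEX my_table_filter_1_idx ON my_table (col_1, col_2, col_3);
        CREATE INDEX my_table_filter_2_idx ON my_table (col_1, col_2, col_3, col_4);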

I still don't have the performance where I want it to be; the size of the
indexes seems to be an issue.  If each index were on one column instead of
four, I think the scans would complete in time.

I searched online and in the list archives, but couldn't find much
information on good strategies for dealing with this.

It looks like my best option is to somehow index the "interpretation" of
these criteria, rather than the criteria themselves.

The two approaches that come to mind are:

        1. Alter the table: add a boolean column for each filter-test, index
that column, and query on it (first sketch below).
        2. Leave the table as-is: write a custom function for each filter,
then build an expression ("functional") index on it (second sketch below).
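
A minimal sketch of option 1, using placeholder names (my_table, boolean
columns col_1..col_3) and assuming PostgreSQL 12+, where a stored generated
column keeps the flag in sync automatically; on older versions a trigger
would have to maintain it instead:

        -- flag column computed from the first filter's criteria
        ALTER TABLE my_table
            ADD COLUMN matches_filter_1 boolean
            GENERATED ALWAYS AS (
                (col_1 IS NULL) AND (col_2 IS NULL)
                AND ((col_3 IS NULL) OR (col_3 = col_1))
            ) STORED;

        -- a single, narrow index on the flag
        CREATE INDEX my_table_flag_1_idx ON my_table (matches_filter_1);

        -- queries then reduce to:
        SELECT count(*) FROM my_table WHERE matches_filter_1;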
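And a minimal sketch of option 2, with the same placeholder names; note
that the function must be marked IMMUTABLE for PostgreSQL to accept it in
an index, and the query has to repeat the identical expression for the
planner to consider the index:

        -- wrap the filter logic in an immutable function
        CREATE FUNCTION matches_filter_1(c1 boolean, c2 boolean, c3 boolean)
        RETURNS boolean LANGUAGE sql IMMUTABLE AS $$
            SELECT (c1 IS NULL) AND (c2 IS NULL)
               AND ((c3 IS NULL) OR (c3 = c1))
        $$;

        -- index the function's result rather than the raw columns
        CREATE INDEX my_table_fn_1_idx
            ON my_table (matches_filter_1(col_1, col_2, col_3));

        -- the query must use the same expression:
        SELECT count(*) FROM my_table
        WHERE matches_filter_1(col_1, col_2, col_3);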

Has anyone else encountered a need like this?

Are there any tips, tricks, or things I should look out for?  Are there
better ways to handle this?
