Vernon,

> What I stated is my observation on my project, which has over twenty
> multivalued detail tables. I have a selection query of about 200
> characters, involving 10 tables and using a subquery. The performance
> is not bad after proper indexing: less than 3 seconds (according to
> the planner). I will have longer queries later and hope they won't
> have any performance problems.

One trick for you is to create a custom aggregate for string concatenation
for each detail table; that will let you list the values in the detail
table as if they were one continuous text string. A concatenation
aggregate is quite fast on PostgreSQL, too.


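-- State function: join two text values with ", ", treating NULL or empty input as absent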
CREATE FUNCTION "comma_cat" (text,text) RETURNS text AS '
SELECT CASE WHEN $2 IS NULL OR $2 = '''' THEN $1
        WHEN $1 IS NULL OR $1 = '''' THEN $2
        ELSE $1 || '', '' || $2
        END
' LANGUAGE 'sql';

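-- State function: join two text values with "<br>", treating NULL or empty input as absent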
CREATE FUNCTION "br_cat" (text, text) RETURNS text AS '
SELECT CASE WHEN $2 IS NULL OR $2 = '''' THEN $1
WHEN $1 IS NULL OR $1 = '''' THEN $2
ELSE $1 || ''<br>'' || $2
END
' LANGUAGE 'sql';

-- create aggregate with HTML <br> breaks between items
CREATE AGGREGATE br_list (
    BASETYPE = text,
    SFUNC = br_cat,
    STYPE = text,
    INITCOND = ''
);

-- create aggregate with commas between items
CREATE AGGREGATE comma_list (
    BASETYPE = text,
    SFUNC = comma_cat,
    STYPE = text,
    INITCOND = ''
);
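
Once the aggregates exist, a grouped query folds the detail rows into one
string per master row. Here is a minimal usage sketch, assuming a
hypothetical master table "orders" and detail table
"order_items(order_id, item_name)" (your table and column names will
differ):

-- one row per order, with its detail values rolled up into a single string
SELECT o.order_id,
       comma_list(oi.item_name) AS item_names
  FROM orders o
  JOIN order_items oi ON oi.order_id = o.order_id
 GROUP BY o.order_id;

Swap in br_list() instead of comma_list() when the result is headed for an
HTML page.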

-Josh Berkus
