Quoting Christopher Kings-Lynne <[EMAIL PROTECTED]>:

> > Another trick you can use with large data sets like this when you
> > want results back in seconds is to have regularly updated tables
> > that aggregate the data along each column normally aggregated
> > against the main data set.
> > Maybe some bright person will prove me wrong by posting some
> > working information about how to get these apparently absent
> > features working.
>
> Most people just use simple triggers to maintain aggregate summary
> tables...

Don't know if this is more appropriate to bizgres, but:
what the first poster is describing is exactly what OLAP cubes do.

For big aggregating systems (OLAP), triggers perform poorly
compared to messy hand-rolled code. You may have dozens of
aggregates at various levels; consider the effect of having
each detail row cascade into twenty separate updates.
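Each of those "simple triggers" looks roughly like the row-level
function below (schema names invented for illustration) -- now
picture twenty of them, one per aggregate, firing for every row:

   CREATE OR REPLACE FUNCTION maintain_sales_summary()
   RETURNS trigger AS $$
   BEGIN
       -- Fold the new detail row into the running per-region total.
       UPDATE sales_by_region
          SET total = total + NEW.amount
        WHERE region = NEW.region;
       IF NOT FOUND THEN
           INSERT INTO sales_by_region (region, total)
           VALUES (NEW.region, NEW.amount);
       END IF;
       RETURN NEW;
   END;
   $$ LANGUAGE plpgsql;

   CREATE TRIGGER sales_summary_trig
       AFTER INSERT ON sales
       FOR EACH ROW EXECUTE PROCEDURE maintain_sales_summary();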

It's particularly silly-looking when data is coming in as 
batches of thousands of rows in a single insert, e.g.

   COPY temp_table FROM STDIN;
   UPDATE fact_table ... FROM ... temp_table;
   INSERT INTO fact_table ... FROM ... temp_table;

   (the above pair of operations is so common that
    Oracle added its "MERGE" statement for it).
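Spelled out, the pair is roughly the following (table and column
names invented; a real load would usually pre-aggregate the staged
rows first):

   BEGIN;
   COPY temp_table FROM STDIN;
   -- fold rows whose keys already exist into the fact table ...
   UPDATE fact_table f
      SET measure = f.measure + t.measure
     FROM temp_table t
    WHERE f.key = t.key;
   -- ... then insert the genuinely new keys
   INSERT INTO fact_table (key, measure)
   SELECT t.key, t.measure
     FROM temp_table t
    WHERE NOT EXISTS
          (SELECT 1 FROM fact_table f WHERE f.key = t.key);
   COMMIT;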

Hence my recent post asking about using RULES to aggregate ---
given no luck with "FOR EACH STATEMENT" triggers.
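Lacking that, the hand-rolled alternative is one set-based update
per aggregate, driven off the staging table (names again invented):

   UPDATE sales_by_region s
      SET total = s.total + agg.amt
     FROM (SELECT region, sum(amount) AS amt
             FROM temp_table
            GROUP BY region) agg
    WHERE s.region = agg.region;

One statement per aggregate level, instead of a per-row trigger
cascade over thousands of rows.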
