Another trick you can use with large data sets like this, when you want results
back in seconds, is to maintain regularly updated summary tables that aggregate
the data along each column you normally aggregate against in the main data set.
> Maybe some bright person will prove me wrong by posting some working information about how to get these apparently absent features working.
Most people just use simple triggers to maintain aggregate summary tables...
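A minimal sketch of that approach, assuming a hypothetical `sales` fact table and a `sales_daily_summary` table (all names and columns here are made up for illustration); an AFTER INSERT trigger folds each new row into the summary:

```sql
-- Hypothetical schema: a large fact table plus a per-day summary table.
CREATE TABLE sales (
    id       bigserial PRIMARY KEY,
    sale_day date      NOT NULL,
    amount   numeric   NOT NULL
);

CREATE TABLE sales_daily_summary (
    sale_day date    PRIMARY KEY,
    total    numeric NOT NULL DEFAULT 0,
    n_rows   bigint  NOT NULL DEFAULT 0
);

-- Trigger function: update the matching summary row, or create it
-- if this is the first row seen for that day.
CREATE OR REPLACE FUNCTION sales_summary_trg() RETURNS trigger AS $$
BEGIN
    UPDATE sales_daily_summary
       SET total  = total + NEW.amount,
           n_rows = n_rows + 1
     WHERE sale_day = NEW.sale_day;
    IF NOT FOUND THEN
        INSERT INTO sales_daily_summary (sale_day, total, n_rows)
        VALUES (NEW.sale_day, NEW.amount, 1);
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER sales_summary
AFTER INSERT ON sales
FOR EACH ROW EXECUTE PROCEDURE sales_summary_trg();
```

Queries that would otherwise scan and aggregate the whole fact table can then read `sales_daily_summary` directly. Note the update-or-insert above is not safe under heavy concurrent inserts for a brand-new day; a retry loop or exception handler may be needed in that case.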
Chris
However, if (insert) triggers prove to be too much of a performance hit, try cron'd functions that perform the aggregation for you. This approach works well for us, using the PKs (a sequence) as the start and stop points for each batch.
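A sketch of that batch variant, assuming the same hypothetical `sales` table (with a `bigserial` primary key `id`) and `sales_daily_summary` table, plus a one-row bookkeeping table that records the last PK processed:

```sql
-- Bookkeeping: the highest sales.id already folded into the summary.
CREATE TABLE summary_progress (last_id bigint NOT NULL);
INSERT INTO summary_progress VALUES (0);

CREATE OR REPLACE FUNCTION refresh_sales_summary() RETURNS void AS $$
DECLARE
    start_id bigint;
    stop_id  bigint;
BEGIN
    -- Work out the PK range for this batch: everything inserted
    -- since the last run.
    SELECT last_id INTO start_id FROM summary_progress;
    SELECT coalesce(max(id), start_id) INTO stop_id FROM sales;

    -- Fold the new slice into existing summary rows...
    UPDATE sales_daily_summary s
       SET total  = s.total  + d.total,
           n_rows = s.n_rows + d.n_rows
      FROM (SELECT sale_day, sum(amount) AS total, count(*) AS n_rows
              FROM sales
             WHERE id > start_id AND id <= stop_id
             GROUP BY sale_day) d
     WHERE s.sale_day = d.sale_day;

    -- ...and create summary rows for days not seen before.
    INSERT INTO sales_daily_summary (sale_day, total, n_rows)
    SELECT sale_day, sum(amount), count(*)
      FROM sales
     WHERE id > start_id AND id <= stop_id
       AND sale_day NOT IN (SELECT sale_day FROM sales_daily_summary)
     GROUP BY sale_day;

    UPDATE summary_progress SET last_id = stop_id;
END;
$$ LANGUAGE plpgsql;
```

Cron then just runs something like `psql -c "SELECT refresh_sales_summary();"` at whatever interval keeps the summaries fresh enough. This assumes rows are never updated or deleted after insert; if they are, the trigger approach (or a full rebuild) is safer.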