Hi, I am still trying to simulate "autogrow" with a thread that checks the database size and adds volumes if required.
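For what it's worth, the polling loop of such an autogrow thread can be sketched roughly as below. This is a hedged, generic sketch, not SAP DB-specific: `get_db_fill_pct` and `add_volume` are hypothetical callbacks standing in for whatever admin interface (e.g. a dbmcli wrapper) actually reports the fill level and adds a volume.

```python
import threading


def autogrow(get_db_fill_pct, add_volume, threshold=90, interval=60, stop=None):
    """Poll the database fill level and add a volume when it crosses a threshold.

    get_db_fill_pct and add_volume are hypothetical stand-ins for the real
    admin calls; threshold is percent full, interval is seconds between checks.
    Returns the stop event and the thread so the caller can shut it down.
    """
    stop = stop or threading.Event()

    def loop():
        while not stop.is_set():
            if get_db_fill_pct() >= threshold:
                add_volume()
            # wait() doubles as a sleep that wakes up immediately on stop
            stop.wait(interval)

    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return stop, t
```

To shut the thread down cleanly, set the returned event and join the thread.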
I have a complicated query that creates a "cross-tab" result (as in MS Access) for our reporting, but once we get up into millions of records it seems to take a large amount of space to execute. Here's the query:

    SELECT Forms.Title AS Form,
           SUM(CASE WHEN Kiosks.Name='loc1' THEN Metrics.Copies ELSE 0 END) AS "loc1",
           SUM(CASE WHEN Kiosks.Name='loc2' THEN Metrics.Copies ELSE 0 END) AS "loc2",
           SUM(CASE WHEN Kiosks.Name='loc3' THEN Metrics.Copies ELSE 0 END) AS "loc3",
           SUM(CASE WHEN Kiosks.Name='loc4' THEN Metrics.Copies ELSE 0 END) AS "loc4",
           SUM(CASE WHEN Kiosks.Name='loc5' THEN Metrics.Copies ELSE 0 END) AS "loc5",
           SUM(Metrics.Copies) AS Total
    FROM Metrics
    INNER JOIN Kiosks ON Metrics.Kiosk_Id = Kiosks.Id
    INNER JOIN Forms ON Metrics.Form_Id = Forms.Id
    WHERE ExecTime IS NOT NULL
      AND Copies IS NOT NULL
    GROUP BY Forms.Title
    ORDER BY UPPER(Forms.Title)

The more locations we have, the more SUM(CASE ...) clauses we need. Does anyone know a way around this? Is there a way to optimize the query? (I realise that adding an UpperTitle column would help the ORDER BY.) Also, is it still possible/necessary to set the MAXDATAPAGES parameter? I get an error whenever I try to.

Cheers,
David
_______________________________________________
sapdb.general mailing list
[EMAIL PROTECTED]
http://listserv.sap.com/mailman/listinfo/sapdb.general
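One possible way around the ever-growing list of SUM(CASE ...) clauses (an assumption on my part, not something from the original post) is to run a flat aggregate, GROUP BY Forms.Title, Kiosks.Name, and pivot the result into the cross-tab in the application. A minimal Python sketch, assuming the flat query returns (title, kiosk, copies) rows:

```python
from collections import defaultdict


def crosstab(rows):
    """Pivot flat (title, kiosk, copies) rows into a cross-tab dict.

    `rows` is what a flat aggregate like
        SELECT Forms.Title, Kiosks.Name, SUM(Metrics.Copies)
        FROM ... GROUP BY Forms.Title, Kiosks.Name
    would return. The per-location columns and the Total column are
    produced here, so the SQL no longer needs one SUM(CASE ...) per kiosk.
    """
    table = defaultdict(lambda: defaultdict(int))
    for title, kiosk, copies in rows:
        table[title][kiosk] += copies
        table[title]['Total'] += copies
    # Convert nested defaultdicts to plain dicts for the caller.
    return {title: dict(cols) for title, cols in table.items()}


rows = [
    ('FormA', 'loc1', 5),
    ('FormA', 'loc2', 3),
    ('FormB', 'loc1', 2),
]
ct = crosstab(rows)
print(ct['FormA']['Total'])  # → 8
```

The flat query stays the same no matter how many locations exist, so adding a kiosk needs no SQL change; the trade-off is that the pivot work (and a bit of memory for the result dict) moves into the client.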
