Ok, the trick of all of this is that the data all comes from MapInfo source files. Actually, they originate in an application called PlanetEV that does grid analysis. I'm converting this data to contour shapefiles, so I'd have to go in and change about 2+ GB of data. This data needs to be updated manually every month, so I want to make my update process as quick as possible; that said, I do understand the need to have the data in the best possible format for server performance.
On Tue, 7 Aug 2007 13:08:00 -0500, Fawcett, David <[EMAIL PROTECTED]> wrote:

>Minimizing processing at runtime can only help.
>
>If you have the disk space and your data is static, you could pre-create
>different layers based on your expression criteria, so that your data
>doesn't have to be evaluated in five different layers.
>
>I don't know if it is still true, but regular expressions used to be
>faster than logical expressions, so you could add another column,
>pre-classify your data and store a class value in that column, something
>like a single-digit integer from 1-5. That might evaluate faster than a
>complex logical expression.
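The pre-classification idea above could be sketched roughly like this during the monthly update, before the data reaches the server. The threshold values and the `value` attribute name are purely hypothetical placeholders; substitute whatever criteria your current layer expressions actually encode:

```python
# Hypothetical pre-classification step: the thresholds and the attribute
# name "value" below are invented for illustration only. Replace them with
# the classification rules your five layer expressions currently apply.

def classify(value):
    """Map a raw grid value to a single-digit class (1-5)."""
    if value < -100:
        return 1
    elif value < -90:
        return 2
    elif value < -80:
        return 3
    elif value < -70:
        return 4
    else:
        return 5

# Simulated attribute records, one per feature.
records = [{"value": v} for v in (-105, -95, -85, -75, -60)]

# Store the class in its own column so the server only compares an integer.
for rec in records:
    rec["class"] = classify(rec["value"])

print([r["class"] for r in records])  # prints [1, 2, 3, 4, 5]
```

Each layer's expression can then match on the stored integer (e.g. `EXPRESSION ([class] = 3)` in a mapfile) instead of re-evaluating a compound logical expression at every request.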
