I'm currently designing a rework of a DB from a speed point of view; we are
using Interbase.  The bone of contention at the moment is a datalogging
table where each unique key represents a system being monitored.  For each
system/unique key there will be tens of thousands of records, giving
hundreds of thousands, even millions, of records in total for the table.
Via the user interface the user only ever sees one system's data at any one
time, so whenever I run a query over the table it must include a WHERE
clause filtering out all records belonging to the other monitored systems.
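
To make the setup concrete, here is a sketch of what the shared table and
the per-system query might look like (table and column names are
hypothetical, not from our actual schema; the index is the part that would
normally keep the WHERE filter cheap):

```sql
/* Hypothetical schema: one datalogging table shared by all systems. */
CREATE TABLE DATALOG (
  SYSTEM_ID  INTEGER NOT NULL,    /* unique key of the monitored system */
  LOGGED_AT  TIMESTAMP NOT NULL,
  READING    DOUBLE PRECISION
);

/* An index on the system key lets the engine jump straight to one
   system's records instead of scanning the whole table. */
CREATE INDEX IDX_DATALOG_SYSTEM ON DATALOG (SYSTEM_ID);

/* The per-system query the UI effectively runs: */
SELECT LOGGED_AT, READING
FROM DATALOG
WHERE SYSTEM_ID = :system_id
ORDER BY LOGGED_AT;
```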

My question, from a speed point of view: would it be quicker to give each
system its own table rather than lumping them all together, thus removing
the need for the WHERE clause and making the query a straight dump of the
data?  Obviously it would be quicker, but how much quicker?  And would it
be noticeable?  I don't know much about Interbase's inner workings, and for
all I know Interbase could effectively be doing this behind my back anyway.


Thanks in advance...



Nahum Wild

Software developer type person
Invensys Energy Systems (NZ) Limited.
(formerly Swichtec Power Systems)
---------------------------------------------------------------------------
  New Zealand Delphi Users group - Database List - [EMAIL PROTECTED]
                  Website: http://www.delphi.org.nz
