From: "Kevin Smith" <[EMAIL PROTECTED]> > I'm "considering" making a huge table, by that I mean approx. 26 millions > records which will be static data, no updates, or deletes, etc. will be > performed on this table, only select statements. ... > At a guess the raw data may be in the region of 9GB. > > My question is, if I was to do a search on the Post Code (never on any > address fields) what sort of performance can I expect from a Dual Zeon > 933Mhz CPUs running Windows 2000 Pro Server with 512MB RAM? > > This is based on the fact the the post code field will be indexed and > perhaps an additional index that indexes the first two characters of the > post code and therefore narrows down the search to specific records to begin > with, ie. OX16 0TH, index this and then create an index with the characters > OX...
So I'm curious as to why you want to make more than one index in the first place. Your main bottleneck is (IMO) your RAM: a single index on this table is going to be around 200-300M, and with additional caching and Windows 2000's own memory requirements you're squeezed for space in 512MB. It would be easier to advise if we knew what sort of queries are going to be run, how many concurrent users there will be, and so on.

Regards,
Russ.
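P.S. That 200-300M figure is just back-of-envelope: 26 million keys times (post code bytes + row pointer) plus B-tree overhead, less whatever key compression saves. Once the data is loaded you can check the real number yourself; assuming the illustrative table name from the sketch above:

    # Rough estimate: 26,000,000 x (8-byte post code + ~4-byte row pointer)
    # is ~300MB before key compression and B-tree overhead.
    SHOW TABLE STATUS LIKE 'address';
    # Index_length shows how big the index file really is, i.e. how much of
    # your 512MB you would want free as key cache to keep it in memory.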