I manage a database server for one of our customers. It is MS SQL
Server 2000, and we have several tables with over 5 million rows that
we query/join constantly. I have had to rewrite some queries for
performance and add indexes here and there, but the database performs
like Showbear describes his -- blink of an eye. You really need to pay
attention to the execution plan of your queries: make sure you
minimize result sets before JOIN operations, use table variables
instead of temp tables, and don't use the IN clause with large
tables -- use a JOIN or EXISTS instead (JOIN is faster). I learned a
lot about this subject from the book "SQL for Smarties" by Joe Celko;
I would suggest it to anyone who works with an SQL database server. If
your DB is designed properly, and more importantly your queries are
written correctly, you should have no problems. If you want to post
your problem schema (SQL CREATE **** statements) and your problem
stored procedures, I would be glad to give you suggestions. As I am
sure David L. Penton and Tore Bostrup would also :-).
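To make the IN-vs-EXISTS-vs-JOIN point concrete, here is a rough
sketch -- table and column names are invented for illustration, not
taken from anyone's actual schema:

```sql
-- Hypothetical tables (Orders, Customers); names are made up.

-- Tends to be slow on large tables: IN with a subquery
SELECT o.OrderID, o.Total
FROM Orders o
WHERE o.CustomerID IN (SELECT CustomerID FROM Customers WHERE Region = 'IN');

-- Usually better: EXISTS, which can stop at the first match
SELECT o.OrderID, o.Total
FROM Orders o
WHERE EXISTS (SELECT 1 FROM Customers c
              WHERE c.CustomerID = o.CustomerID AND c.Region = 'IN');

-- Often fastest (assuming CustomerID is unique): a plain JOIN
SELECT o.OrderID, o.Total
FROM Orders o
JOIN Customers c ON c.CustomerID = o.CustomerID
WHERE c.Region = 'IN';

-- Table variable instead of a temp table (supported in SQL Server 2000)
DECLARE @Recent TABLE (OrderID int PRIMARY KEY, Total money);
INSERT INTO @Recent (OrderID, Total)
SELECT OrderID, Total FROM Orders WHERE OrderDate >= '2002-01-01';
```

As always, check the execution plan for your own data -- which variant
wins depends on the indexes and row counts involved.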

Of course, I am assuming your hardware configuration is sufficient to
support MS SQL Server 2000.

Ben Timby
Webexcellence
PH: 317.423.3548 x23
TF: 800.808.6332 x23
FX: 317.423.8735
[EMAIL PROTECTED]
www.webexc.com 

-----Original Message-----
From: Showbear [mailto:[EMAIL PROTECTED]] 
Sent: Wednesday, July 31, 2002 7:58 AM
To: ActiveServerPages
Subject: RE: Sql Server 2K


Dan, one of our apps has a SQL Server 2000 database in which some
tables contain over 25 million rows.  Properly indexed, performance
is - to be scientific - blink-of-an-eye fast.
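"Properly indexed" here typically means covering the columns used in
your JOINs and WHERE clauses. A hypothetical sketch (table and column
names invented):

```sql
-- Hypothetical: index the join key and the common filter column
CREATE INDEX IX_Orders_CustomerID ON Orders (CustomerID);
CREATE INDEX IX_Orders_OrderDate  ON Orders (OrderDate);
```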

HTH

-----Original Message-----
From: Daniel Field [mailto:[EMAIL PROTECTED]] 
Sent: Wednesday, July 31, 2002 10:09 AM
To: ActiveServerPages
Subject: OT: Sql Server 2K


Hi all,

I have a fairly complex DB structure from which I query and return
XML results.  The structure is working fine, but the query can be a
bit slow, as I have to query several different tables (well,
comparatively slow -- running about 700-2000 ms).  I have created a
lookup table, which is a single table with all the variables in it;
this table is currently at 250,000 rows.  I haven't rewritten the
query to use it yet, but is it better to do it this way rather than
link across the 5 or so tables?  The row count could easily get close
to the 1 million mark in the future.

Dan




---
You are currently subscribed to activeserverpages as:
[EMAIL PROTECTED] To unsubscribe send a blank email to
%%email.unsub%%






