Performance experts,
I am testing a web services server that relies on a SQL Server DB as its system
of record. The way this database is designed, most of the search web services
momentarily push the DB CPU utilization pretty high (close to 50% or so) on a
single (standalone) query operation.
Under load - since I cannot predict which operations will run concurrently with
one another - I see huge variance in the execution times of these search
operations (anywhere from the base time to about 10 times the base time).
Averaging such numbers also does not make much sense. As a result, as I try to
optimize the DB or other layers, I cannot confidently say whether performance
is improving, because every time I run the test I get different numbers for
each search operation depending on what else is running concurrently with it.
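(One way to make run-to-run comparisons meaningful despite the variance is to
compare percentiles - median, p90, p95 - rather than the mean. A minimal Python
sketch, assuming latency samples in seconds have been collected from the test
harness; the numbers below are made up for illustration:)

```python
import statistics

# Hypothetical latency samples (seconds) for one search operation under load.
# In a real test these would come from the harness or server logs.
samples = [0.8, 0.9, 1.1, 0.85, 7.2, 1.0, 0.9, 5.4, 1.2, 0.95]

median = statistics.median(samples)
# quantiles(n=20) returns 19 cut points at 5% steps;
# index 17 is the 90th percentile, index 18 the 95th.
p90, p95 = statistics.quantiles(samples, n=20)[17:19]

print(f"median={median:.2f}s  p90={p90:.2f}s  p95={p95:.2f}s")
```

Comparing the median and tail percentiles of the same operation before and
after a change is far more stable than comparing averages skewed by a few
contended runs.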
Of course, I can run these search operations one at a time (i.e. run nothing
else while a search is running - in a single thread), but then I cannot
observe some of the other issues I need to see.
How do you tackle such a situation?
Matt