>Please correct me if I am wrong, but much of this discussion leads me to 
>believe that we are saying that there is essentially no reasonable way to 
>predict or model performance for a given application process.

I have been a performance/capacity analyst for 27 years, and in my experience worrying about CPU 
has become largely moot.
Yes, there are still examples of sub-optimal algorithms, but after 40+ years of hardware gains, 
reducing CPU consumption rarely improves application performance any more (IMO).

I/O and resource contention are the biggest/heaviest hitters.
Optimise those and you'll get a bigger bang for your effort.

I'm constantly amazed at the number of files out there with 'bad' block-sizes, 
no buffering, etc.
Those are a better place to spend your time.
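To illustrate the buffering point with a rough sketch (plain Python rather than anything mainframe-specific; the file names and record counts are invented), many small unbuffered writes each cost a system call, while a reasonable buffer coalesces them:

```python
import os
import tempfile
import time

RECORD = b"x" * 100          # one small 100-byte record
N = 50_000                   # number of records to write

def write_records(path, buffering):
    """Write N small records; buffering=0 forces a syscall per write."""
    start = time.perf_counter()
    with open(path, "wb", buffering=buffering) as f:
        for _ in range(N):
            f.write(RECORD)
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as d:
    slow = write_records(os.path.join(d, "unbuffered.dat"), buffering=0)
    fast = write_records(os.path.join(d, "buffered.dat"), buffering=64 * 1024)
    print(f"unbuffered: {slow:.3f}s  buffered: {fast:.3f}s")
```

The same trade-off applies to dataset block sizes: the bigger the unit of transfer, the fewer physical I/Os for the same data.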

Resource contention is caused either by a mismatch of shared versus exclusive 
intent, or by scheduling.
Again, you can do better by spending your effort here (again, IMO).
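The shared-versus-exclusive mismatch can be sketched in a few lines (a hypothetical Python illustration; the reader count and hold time are made up): readers that only need shared access, but are forced through an exclusive lock, serialize instead of overlapping.

```python
import threading
import time

READERS = 8
HOLD = 0.02                  # simulated time each reader "uses" the resource

def run_readers(lock=None):
    """Run READERS threads; with an exclusive lock, read-only work serializes."""
    def reader():
        if lock:
            with lock:
                time.sleep(HOLD)   # exclusive intent: one reader at a time
        else:
            time.sleep(HOLD)       # shared intent: readers overlap freely
    threads = [threading.Thread(target=reader) for _ in range(READERS)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

exclusive = run_readers(threading.Lock())
shared = run_readers()
print(f"exclusive: {exclusive:.3f}s  shared: {shared:.3f}s")
```

With exclusive intent the elapsed time grows with the number of readers; with shared intent it stays near the single-reader cost.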

If you cannot show a CPU saving larger than the variance of your measurements, 
then I posit that you are likely wasting time and effort.
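That test can be made concrete with a simple two-sigma rule (a hedged sketch; the sample values and the helper name are invented, and real analysis might use a proper significance test instead):

```python
import statistics

def savings_is_real(before, after, sigmas=2.0):
    """Return True only if the mean CPU drop exceeds `sigmas` times
    the combined standard error of the two sample sets."""
    drop = statistics.mean(before) - statistics.mean(after)
    se = (statistics.stdev(before) ** 2 / len(before)
          + statistics.stdev(after) ** 2 / len(after)) ** 0.5
    return drop > sigmas * se

# Hypothetical CPU-seconds per run, before and after a tuning change.
before = [10.2, 10.5, 9.9, 10.4, 10.1]
after  = [10.0, 10.3, 9.8, 10.2, 10.0]
print(savings_is_real(before, after))   # a "saving" lost in the noise -> False
```

If the drop does not clear the measurement noise across repeated runs, you cannot credibly claim the tuning helped.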

-
Too busy driving to stop for gas!

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html