The relation is definitely *NOT* linear.
According to the M/M/1 queueing model, given random transaction arrival,
throughput begins to decline at about 30% utilization, while response
time increases in a non-linear fashion, asymptotically approaching
infinity as utilization approaches 100%.
The curve really takes off after about 40% utilization.
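A minimal sketch of that curve, using the standard M/M/1 mean response
time R = S / (1 - rho) and a hypothetical 10 ms service time (the service
time is an assumption for illustration, not a measured value):

```python
# M/M/1 sketch: mean response time vs. utilization (rho).
# Assumes Poisson arrivals and exponential service times.

S = 0.010  # assumed mean service time per transaction, in seconds

def mm1_response_time(rho, service=S):
    """Mean response time R = S / (1 - rho) for an M/M/1 queue."""
    if not 0.0 <= rho < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return service / (1.0 - rho)

for rho in (0.10, 0.30, 0.40, 0.70, 0.90, 0.99):
    r = mm1_response_time(rho)
    print(f"utilization {rho:4.0%}: response {r * 1000:7.1f} ms "
          f"({r / S:5.1f}x service time)")
```

Note how the slowdown factor 1/(1 - rho) is only about 1.4x at 30%
utilization but 10x at 90% and 100x at 99% -- the "takes off" behavior
described above.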
Also, *ix systems spend more of their time in "supervisor" state than
MVS does. When I last worked in a *ix environment, supervisor ("system")
state was about 60-70 percent of total consumption, as opposed to about
10% for z/OS.
In fairness, this does seem to be a difference in the way the beans are
counted: z/OS goes to a fair amount of work to assign CPU utilization
to the consuming task; *ix does not do this.
<snip>
Is it that at some very low utilization, delta, the throughput is
epsilon, and if the relation were linear, at 100% utilization the
throughput would be expected to be epsilon/delta, but as the workload
increases, the utilization rises to 100%, but the throughput never rises
above 0.3*epsilon/delta? If so, what form of overhead is consuming 70%
of the "utilization"?
For what sort of workload mix does the statement apply? .......
</snip>
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html