Phil Smith III wrote:
(Cross-posted to IBM-VM and IBM-MAIN)
A buddy asked me:
"At a previous employer, someone had an article, poster or something (I know - real
specific - it was 15+ years ago) that tried to put the time for computer events into
perspective. It started with the quickest instruction (RR) having a baseline of 1 second.
It then proceeded to go through all of the instructions, RX, RS, SS, etc., and then into
I/O, MIH, and so on. Have you ever heard of or seen anything like this? I'm having trouble
stressing the impact of poor I/O response time and I thought this might be of
use."
I had to tell him I hadn't ever seen such a thing, but would like to. I figure
if anyone else alive knows what this is/was, they'll be on one of these two
lists...!
Anyone?
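Not the article, but the basic idea is easy enough to reconstruct. Here's a
rough sketch of that kind of comparison; every latency in it is an
order-of-magnitude guess on my part, not a figure from the article or from
any particular machine:

    # Scale everything so the fastest (RR) instruction takes 1 second.
    # All of these latencies are assumed round numbers, purely illustrative.
    events = {
        "RR instruction":                     1e-6,   # assumed ~1 microsecond
        "memory reference":                   1e-5,   # assumed ~10 microseconds
        "DASD I/O (seek, latency, transfer)": 30e-3,  # assumed ~30 milliseconds
    }
    baseline = events["RR instruction"]
    for name, seconds in events.items():
        scaled = seconds / baseline   # how long it "takes" once an instruction is 1 second
        print(f"{name:40s} {scaled:>12,.0f} scaled seconds")
    # The RR instruction comes out at 1 second, the memory reference at 10,
    # and the DASD I/O at 30,000 -- call it eight and a half hours of
    # "machine time" spent waiting on a single I/O.

That's the kind of picture that makes poor I/O response time hurt.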
When I got into system programming in '82, I remember something about
comparing computer time to "people" speed.
The situation was this: the fastest, best, and brightest computer operator was
standing in front of the console when a WTOR appeared. They knew that
the WTOR was going to appear, they knew what the reply was going to be,
they had their fingers over the keyboard ready to type, and as soon as
it popped up they typed in the reply as fast as they could.
If you pretend it took, on average, 1 second to execute an instruction, then
it just took that operator about 1 year to reply to the WTOR.
I don't know what machine type they were referring to; I don't think it
mattered that much at the time.
There are just over 31.5 million seconds in a year.
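The arithmetic works out if you assume something on the order of a
microsecond per instruction and a reply that really takes about 30 seconds;
both numbers below are round guesses of mine, not part of the original story:

    # The "one instruction = one second" scaling, with assumed round numbers.
    instruction_time = 1e-6     # assumed: about a microsecond per instruction
    operator_reply   = 30.0     # assumed: real seconds for that very fast reply
    scale = 1.0 / instruction_time          # so one instruction "takes" 1 scaled second
    scaled_reply = operator_reply * scale   # 30,000,000 scaled seconds
    seconds_per_year = 365.25 * 24 * 3600   # just over 31.5 million
    print(scaled_reply / seconds_per_year)  # prints roughly 0.95 -- call it a year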
I can't remember the whole thing, but I believe that Grace Hopper used
to hand out different lengths of wire to show how long, or how short, various
measurements of time were: a nanosecond vs. a full second.