> A lecturer at KTH told me about coding way-back-when: You turned
> in your code on cards to the computer operator and had them
> processed. Then, whenever the computer was finished (depending
> on load and other people's jobs queued up to go), the operator
> would give you back a printout of the program output.
Actually, we had access to the card readers, and we had bins where the
printouts were put by the operators. Earning a machine room key was a
milestone, and not for everyone. The cognoscenti waited until the evening
backup finished at 2330 or so, since most people would leave when access was
disabled for backup. The term "Moore School mole" was coined to refer to
people who came upstairs to see the sun when the morning shift operators
arrived. Generally they'd send out a little weather report when they got to
the operator's console.
For the final project of my 3rd-year operating systems course, most
students were averaging about 6000 CPU seconds per batch for each of 5
batches. I wrote the code a bit more cleverly, and it ran in 400 seconds
for the entire suite. The trick? Why simulate time that didn't have
anything interesting happening? Mind you, I spent a lot of time checking
the results because I couldn't quite believe that it was *that* much faster.
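In modern terms, the trick is event-driven simulation: keep a queue of
pending events ordered by time and jump the simulated clock straight to
the next event, rather than ticking through every second where nothing
happens. Here's a minimal sketch of the idea, in Python rather than the
assembly/ALGOL of the day, and purely an illustration of the approach,
not the original code:

    import heapq
    import itertools

    # Event-driven simulation: instead of advancing the clock one tick
    # at a time, keep a priority queue of pending events and jump the
    # clock straight to the next one. Idle stretches cost nothing.

    def simulate(initial_events, horizon):
        # initial_events: iterable of (time, action) pairs; action(now)
        # may return an iterable of new (time, action) pairs to schedule.
        counter = itertools.count()  # tie-breaker for events at equal times
        queue = [(t, next(counter), a) for t, a in initial_events]
        heapq.heapify(queue)
        clock = 0
        while queue:
            clock, _, action = heapq.heappop(queue)
            if clock > horizon:
                break
            for t, a in action(clock) or ():
                heapq.heappush(queue, (t, next(counter), a))
        return clock

    # Example: a job arrives at t=10 and finishes 5 units later.
    # Only two events are processed, however large the horizon.
    def job_done(now):
        print("t=%d: job finished" % now)

    def job_arrives(now):
        print("t=%d: job arrived" % now)
        return [(now + 5, job_done)]

    simulate([(10, job_arrives)], horizon=1_000_000)

The win is exactly what the numbers show: the cost scales with the
number of interesting events, not with the length of simulated time.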
The first mainframe computer I used required manually punching cards. After
that experience, I swore I'd never punch cards again, and I never did, although
I did use half-duplex terminals into which we'd hacked an editor (if I
recall correctly, David Garfield did the first hack, and a group of us built
on it).
First language: assembly followed by ALGOL
First book: Techniques of Program Structure and Design (Yourdon)
--- Noel