On 4/11/2019 8:40 AM, Phil Smith III wrote:
And I'm 99.9% sure that DASD capacity was determined by building the geometry and then trying various densities until error rates became unacceptable, then backing off slightly. That would explain the weird, random sizes with each generation (until 3390, after which it went to arrays and became standardized on what future generations will consider a weird size).
This reminds me of my first (junk-pile) floppy disk drive back in the 1970s, for my home-made computer. I had little money, so I built my own controller out of a dozen chips and wrote some 8080 code to handle the I/O. So the format of the disk was totally up to me, and not compatible with anything else. I did just what you said and settled on about 3K per track. But that was with no separate records or sectors - you had to read the entire track if you wanted any data, which I found out later (when I took my first computer class) wasn't too smart.
But yeah, I remember looking at my dad's oscilloscope and adjusting the timing and size until the end of a track didn't overlay the start of the same track :)
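That "fit the track" exercise is really just arithmetic: at a given spindle speed and bit rate, one revolution holds data-rate times rotation-period bits. A minimal sketch with assumed values typical of early single-density drives (300 RPM, 125 kbit/s FM; both numbers are my assumptions, not from the post above) happens to land right at the ~3K figure mentioned:

```python
# Rough track-capacity arithmetic for an early floppy drive.
# RPM and data rate are illustrative assumptions, not from the post.
RPM = 300                 # assumed spindle speed
DATA_RATE_BPS = 125_000   # assumed FM (single-density) bit rate

rotation_period_s = 60 / RPM                      # one revolution = 0.2 s
bits_per_track = DATA_RATE_BPS * rotation_period_s
bytes_per_track = bits_per_track / 8

print(f"{bytes_per_track:.0f} bytes per raw track")  # prints "3125 bytes per raw track"
```

With no sector headers, gaps, or CRCs, all 3125 raw bytes are usable, which is why a whole-track format squeezes out "about 3K" where a sectored format would give up some of it to overhead.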
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
