----- Original Message -----
Sent: Saturday, July 21, 2007 4:20 PM
Subject: [USMA:39118] Re: Discussion on the metric system (off topic -- of course)
Stan:

Please excuse the delayed response. I only visit this list occasionally these days.

Although this business of codes is obviously somewhat off-topic, it's interesting, especially to those of us concerned with the niceties of the metric system, and therefore inclined (probably, but not necessarily) to the view that there's no such thing as an uninteresting number (or, apparently, code).
The interesting thing about EBCDIC is that, as with the old 6-bit BCD, there's a direct correspondence between the encoding of any given character and its representation, as punch holes, on the now-obsolete punch cards. Every one of the 256 values has a corresponding set of punch holes. And, of course, as the punch card came first, EBCDIC code points are based on that, rather than the other way around.
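That character-to-byte mapping is still visible from modern code. Here is a minimal Python sketch using the standard library's cp037 codec; cp037 is one common US EBCDIC variant, and picking it is my assumption, since the thread doesn't name a specific code page:

```python
# Encode a few characters as EBCDIC bytes using code page cp037
# (a US EBCDIC variant shipped with Python's codec library; the
# exact code page choice is an assumption, not from the thread).
for ch in ("A", "a", "0"):
    (b,) = ch.encode("cp037")          # one EBCDIC byte per character
    print(f"{ch!r} -> EBCDIC 0x{b:02X}")
# 'A' -> EBCDIC 0xC1, 'a' -> EBCDIC 0x81, '0' -> EBCDIC 0xF0
```

Note how the EBCDIC values differ from ASCII ('A' is 0xC1 rather than 0x41), reflecting the punch-card zone-and-digit heritage Bill describes rather than a contiguous alphabet.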
Used to the maximum, the 12 rows of a punch card column could, of course, accommodate 4096 unique values. IBM's "scientific" 7000 series computers used row binary to take advantage of that, with the first 72 columns of one card being able to store the contents of twenty-four 36-bit words.
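The arithmetic behind those figures checks out; this is just a plain Python restatement of the numbers in the paragraph:

```python
ROWS_PER_COLUMN = 12   # punch rows in one card column
COLUMNS_USED = 72      # first 72 of a card's 80 columns
WORD_BITS = 36         # word size on the 7000-series "scientific" machines

# 12 binary rows per column give 2^12 = 4096 unique values.
assert 2 ** ROWS_PER_COLUMN == 4096

# In row binary, 12 rows x 72 columns = 864 bits per card,
# which is exactly twenty-four 36-bit words.
bits_per_card = ROWS_PER_COLUMN * COLUMNS_USED
assert bits_per_card == 864
assert bits_per_card // WORD_BITS == 24
```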
However, although looking back is fun, I'm glad technology has moved on. I've never missed those days of humping ten-thousand-card cartons of punch cards around the computer room (or the card jams or the dropped cards).
Bill
Thanks, Bill, for the correction and further explanation.

EBCDIC was invented to use the full 8 bits for expanded representations.

Stan Doore
----- Original Message -----
Sent: Tuesday, June 19, 2007 4:33 PM
Subject: [USMA:38932] Re: Discussion on the metric system
Stan Doore wrote: "IBM invented the hexadecimal to provide for all types of international characters and many special symbols."
Not quite. For that purpose, they invented and introduced EBCDIC (Extended Binary Coded Decimal Interchange Code), for which the unit was/is the byte, defined as a group of 8 bits. Because the three-bit grouping of octal notation was potentially awkward, they introduced four-bit [half-byte] hexadecimal notation, which already existed conceptually but had no practical application in the days of computers with 36-bit word sizes (e.g., the IBM 7090). Any EBCDIC value was thus expressible as 2 hexadecimal digits (as was, eventually, any 7-bit ISO 646 [ASCII in the US] value).
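The two-hex-digits-per-byte point is easy to demonstrate; here is a minimal Python sketch, with the sample byte values chosen by me for illustration:

```python
# Any 8-bit value fits in exactly two hexadecimal digits,
# because each hex digit covers one four-bit half of the byte.
for value in (0x00, 0xC1, 0xFF):   # illustrative sample bytes
    print(f"decimal {value:3d} -> hex {value:02X}")

# Verify the claim across the whole 8-bit range.
assert all(len(f"{v:02X}") == 2 for v in range(256))
```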
Of course, it was still awkward, in that we all had to learn to use A through F for the six four-bit groupings beyond the one expressed as 9.
Code points in the 16-bit Basic Multilingual Plane of today's Unicode are, of course, expressible as strings of four hexadecimal digits (code points beyond it need more).
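That four-hex-digit convention is the familiar U+XXXX notation; a short Python sketch, with sample characters of my own choosing:

```python
# Code points up to U+FFFF (the Basic Multilingual Plane) need at
# most four hexadecimal digits; sample characters are illustrative.
for ch in ("A", "µ", "€"):
    print(f"{ch!r} -> U+{ord(ch):04X}")
# 'A' -> U+0041, 'µ' -> U+00B5, '€' -> U+20AC
```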
Bill Potts
(whose first experience with a computer was on the Burroughs E101 Desk Size Engineering Computer, with its 256 10-digit decimal words on a drum, plugboard programming, and a contemporary accounting-machine numerals-only print mechanism).