Stan:
 
This is the statement with which I have a problem: "The five bits of the
Baudot code are incorporated into the eight-bit ASCII code to ensure
compatibility between the old and new teletype machines."
 
I can understand (intellectually, but not practically) the creation of a 7-
or 8-bit code in which, say, the five low-order bits are ITA-2 code points.
However, such a code is not ASCII (or EBCDIC). In ITA-2 code (in which
letters are upper-case only, as they were in BCD), A, B, C and D are 11000
(24), 10011 (19), 01110 (14), and 10010 (18), respectively. They are not
even in ascending sequence and, of course, those same codes are used in
Figures Shift mode for =, ?, :, and $. In ASCII, the five low-order bits of
the codes for A, B, C and D are 00001, 00010, 00011, and 00100,
respectively. So there's no correspondence at all.
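The mismatch is easy to verify mechanically. Below is a small Python sketch (an illustration only, using just the four ITA-2 letter values quoted above, in the same bit order as the text) comparing them with the five low-order bits of the ASCII codes:

```python
# Compare the ITA-2 letter codes quoted above with the five
# low-order bits of ASCII. Only the four letters discussed are
# included; this is not a complete code table.
ita2 = {"A": 0b11000, "B": 0b10011, "C": 0b01110, "D": 0b10010}

for letter, code in ita2.items():
    ascii_low5 = ord(letter) & 0b11111   # five low-order bits of ASCII
    print(f"{letter}: ITA-2 {code:05b} ({code:2d})  "
          f"ASCII low 5 bits {ascii_low5:05b} ({ascii_low5:2d})")
```

No letter matches, confirming that ASCII was not built by embedding the ITA-2 code points.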
 
The older ("Telex") and the newer (TWX) teletype machines were not, in fact,
compatible. As their public networks were owned by competing companies
(Western Union and AT&T, I believe), compatibility wasn't much of an issue.
Devices (typically computer peripherals) were built that could handle any 5,
6, 7 or 8-bit paper tape, using any code (determined by the software
controlling them). I managed a computer service bureau, in 1971, where our
RJE (Remote Job Entry) terminal had paper tape capabilities, including
6-level tape containing output from cash registers. Incidentally, when
connected to computer-based networks, Telex and TWX terminals were able to
benefit from the code conversion capabilities of the computer software.
 
By the way, you may have missed my reference to 36-bit words in message
38931.
 
Best regards,
 
Bill Potts
SI Navigator (http://metric1.org) 


  _____  

From: STANLEY DOORE [mailto:[EMAIL PROTECTED] 
Sent: Monday, July 23, 2007 00:13
To: [EMAIL PROTECTED]; U.S. Metric Association
Subject: Re: [USMA:39147] Re: Discussion on the metric system (off topic --
of course)


Thanks, Bill, for the very detailed technical explanation.
    I wasn't trying to belabor the point, but only to tell how the eight-bit
ASCII code was adopted in practice. The five bits of the Baudot code are
incorporated into the eight-bit ASCII code to ensure compatibility between
the old and new teletype machines, and to allow ASCII to be used internally
in computers like those we now have in PCs, etc.
    Word length in early IBM computers was 36 bits, which is not divisible
by 8 (ASCII). That's why the 16-bit, 32-bit, 64-bit and 128-bit
(PlayStations and supercomputers) word machines were created for
compatibility.
    Now that manufacturing longer-word machines is becoming less expensive,
and the need for intense image and video processing is growing, industry is
moving to longer word machines. This is necessary as new technology and
high-definition TV come onto the market, and discs like Blu-ray, with a
capacity of 50 gigabytes, become necessary.
Regards,  Stan Doore
 

----- Original Message ----- 
From: Bill Potts <mailto:[EMAIL PROTECTED]>
To: U.S. Metric Association <mailto:[email protected]>
Sent: Sunday, July 22, 2007 8:25 PM
Subject: [USMA:39147] Re: Discussion on the metric system (off topic -- of
course)

Stan:
 
I'm afraid I must respectfully disagree with you about Baudot Code. The
five-bit code used for teletype machines was a 1930 variant, called ITA-2
(International Telegraph Alphabet No. 2). TWX (TeletypeWriter eXchange)
machines used ITA-5, otherwise known as ASCII. I will acknowledge, though,
that ITA-2 was known colloquially as Baudot Code. (See
http://groups.msn.com/CTOSeaDogs/baudotcode1.msnw.)
 
However, that disagreement is really only semantic. More important is that
there is no relationship between the 5-bit codes and either 7- or 8-bit
ASCII (i.e., ASCII is not an extension of ITA-2, as even a cursory
examination of the two code tables will show). With the 5-bit codes, the
meaning depended on whether the device was in Letters Shift or Figures Shift
mode; and, of course, two of the code points were used for effecting the
shift. One of the virtues of ASCII and EBCDIC (and previously, of BCD) is
that, for a given natural language, every code point is unique and there's
no possibility of getting a garbled message because of being in the wrong
shift mode. Seven-bit ASCII includes Shift Out and Shift In code points,
used to change the character associated with some of the alphanumeric code
points. I've never worked with an actual implementation of that, though.
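The shift-mode hazard Bill describes can be sketched in a few lines of Python. This toy decoder uses only the four letter codes quoted earlier and their Figures Shift counterparts as given in the text, plus the standard ITA-2 LTRS/FIGS shift code points; it is not the full ITA-2 table, just a demonstration that one code point decodes differently depending on the receiver's mode:

```python
# Toy decoder: the same 5-bit code point means different things
# depending on shift state. The tiny letter/figure tables here are
# only the examples from the discussion, not the full ITA-2 code.
LTRS, FIGS = 0b11111, 0b11011            # standard ITA-2 shift code points
letters = {0b11000: "A", 0b10011: "B", 0b01110: "C", 0b10010: "D"}
figures = {0b11000: "=", 0b10011: "?", 0b01110: ":", 0b10010: "$"}

def decode(stream, mode=None):
    """Decode a list of 5-bit code points, tracking shift state."""
    mode = mode or letters
    out = []
    for code in stream:
        if code == LTRS:
            mode = letters
        elif code == FIGS:
            mode = figures
        else:
            out.append(mode.get(code, "?"))
    return "".join(out)

# Identical code points before and after the shift decode differently:
msg = [0b11000, 0b10010, FIGS, 0b11000, 0b10010]
print(decode(msg))   # AD=$
```

A receiver that missed the FIGS character would print "ADAD" instead, which is exactly the garbling ASCII's unique code points were designed to prevent.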
 
Another feature of ASCII and EBCDIC is the dedication of the lower-value
code points to control functions (0 to 31 [hex 1F] for ASCII, 0 to 63 [hex
3F] for EBCDIC). Other than Letters Shift, Figures Shift, Carriage Return,
Line Feed, and Space, ITA-2 had no assigned control code points. The ones
I've mentioned were independent of the shift mode and, therefore, could
legitimately be called control codes. As BEL (bell) was simply the Figures
Shift counterpart of the letter S (i.e., same code point, different shift
status), it can't be considered a control code.
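The dedicated control ranges can be checked directly. The snippet below uses the boundary values given above (0x1F for ASCII, 0x3F for EBCDIC) and ASCII's BEL, which, unlike ITA-2's bell, has its own code point independent of any shift state:

```python
# ASCII dedicates code points 0-31 (0x00-0x1F) to control functions;
# EBCDIC dedicates 0-63 (0x00-0x3F). BEL has a code point of its own
# in ASCII, rather than sharing one with a letter via a shift mode.
ASCII_CONTROL_END = 0x1F
EBCDIC_CONTROL_END = 0x3F

BEL = 0x07
assert BEL <= ASCII_CONTROL_END        # BEL sits inside the control block
assert ord("S") > ASCII_CONTROL_END    # 'S' (0x53) is a printable letter

print(f"BEL = 0x{BEL:02X}, 'S' = 0x{ord('S'):02X}")
```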
 
Best regards,
 
Bill Potts
SI Navigator (http://metric1.org)


  _____  

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf
Of STANLEY DOORE
Sent: Sunday, July 22, 2007 15:06
To: U.S. Metric Association
Subject: [USMA:39145] Re: Discussion on the metric system (off topic -- of
course)



Hi Bill et al:
    Sounds like you and I came from the same era (circa 1958) of punched
cards.  I was on the US federal advisory committee for standardizing on the
eight-bit ASCII code.  We selected the eight-bit ASCII code as the base even
though IBM wanted a BCD-based system.
    At the time, the whole world used the five-bit Baudot code in
communications, and Digital Equipment Corporation computers used an
extension of it (ASCII) internal to their computers. It meant that the
conversion would be less stressful, less complex and more compatible by
expanding the five-bit Baudot code to the eight-bit ASCII code, for various
reasons including the accommodation of international and special characters
for both communications and computers. Eight bits became the byte used in
computers today, while six bits were used to represent characters in early
machines from IBM, etc.
Regards,  Stan Doore
 
 

----- Original Message ----- 
From: Bill Potts <mailto:[EMAIL PROTECTED]>
To: U.S. Metric Association <mailto:[email protected]>
Sent: Saturday, July 21, 2007 4:20 PM
Subject: [USMA:39118] Re: Discussion on the metric system (off topic -- of
course)

Stan:
 
Please excuse the delayed response. I only visit this list occasionally
these days.
 
Although this business of codes is obviously somewhat off-topic, it's
interesting, especially to those of us concerned with the niceties of the
metric system and therefore of the view (probably, but not necessarily) that
there's no such thing as an uninteresting number (or, apparently, code).
 
The interesting thing about EBCDIC is that, as with the old 6-bit BCD,
there's a direct correspondence between the encoding of any given character
and its representation, as punch holes, on the now-obsolete punch cards.
Every one of the 256 values has a corresponding set of punch holes. And, of
course, as the punch card came first, EBCDIC code points are based on that,
rather than the other way around.
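That card-to-code correspondence can be sketched for the letters. The mapping below follows the usual Hollerith zone-and-digit convention (A-I under the 12-zone, J-R under the 11-zone, S-Z under the 0-zone), which EBCDIC mirrors in its high nibble; this covers letters only, not all 256 values:

```python
# Sketch of the EBCDIC/punch-card correspondence for upper-case
# letters. Hollerith card code: A-I = 12-zone + digits 1-9,
# J-R = 11-zone + digits 1-9, S-Z = 0-zone + digits 2-9. EBCDIC
# keeps the same three groups in its high nibble: 0xC_, 0xD_, 0xE_.
def ebcdic_letter(ch):
    """Return the EBCDIC code point for an upper-case letter A-Z."""
    i = ord(ch) - ord("A")
    if i < 9:                       # A-I: 12-zone
        return 0xC1 + i
    elif i < 18:                    # J-R: 11-zone
        return 0xD1 + (i - 9)
    else:                           # S-Z: 0-zone
        return 0xE2 + (i - 18)

for ch in "AJS":
    print(f"{ch}: EBCDIC 0x{ebcdic_letter(ch):02X}")
# A: 0xC1, J: 0xD1, S: 0xE2
```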
 
Used to the maximum, the 12 rows of a punch card column could, of course,
accommodate 4096 unique values. IBM's "scientific" 7000 series computers
used row binary to take advantage of that, with the first 72 columns of one
card being able to store the contents of twenty-four 36-bit words.
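The arithmetic behind those capacity figures works out exactly:

```python
# Capacity of a punch-card column, and of row-binary storage on a card.
rows = 12                              # punch rows per column
values_per_column = 2 ** rows          # 4096 unique punch patterns
used_columns = 72                      # columns used in row binary
bits = rows * used_columns             # bits stored per card
words = bits // 36                     # 36-bit words per card

print(values_per_column, bits, words)  # 4096 864 24
```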
 
However, although looking back is fun, I'm glad technology has moved on.
I've never missed those days of humping ten-thousand-card cartons of punch
cards around the computer room (or the card jams or the dropped cards).
 
Bill

  _____  

From: G Stanley Doore [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, June 20, 2007 10:20
To: [EMAIL PROTECTED]; U.S. Metric Association
Subject: Re: [USMA:38932] Re: Discussion on the metric system


Thanks, Bill, for the correction and further explanation.
EBCDIC was invented to use the full 8 bits for expanded representations.
Stan Doore
 

----- Original Message ----- 
From: Bill Potts <mailto:[EMAIL PROTECTED]>  
To: U.S. Metric Association <mailto:[email protected]>  
Sent: Tuesday, June 19, 2007 4:33 PM
Subject: [USMA:38932] Re: Discussion on the metric system

Stan Doore wrote: "IBM invented the hexadecimal to provide for all types of
international characters and many special symbols."
 
Not quite. For that purpose, they invented and introduced EBCDIC (Extended
Binary Coded Decimal Interchange Code), for which the unit was/is the byte,
defined as a group of 8 bits. Because the three-bit grouping of the octal
notation was potentially awkward, they introduced four-bit [half byte]
hexadecimal notation, which already existed conceptually, but had no
practical application in the days of computers with 36-bit word sizes (e.g.,
the IBM 7090). Any EBCDIC value was thus expressible as 2 hexadecimal digits
(as was, eventually, any 7-bit ISO 646 [ASCII in the US] value).
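The awkwardness of octal and the neatness of hex for bytes can be seen directly: a byte splits evenly into two 4-bit nibbles (one hex digit each), but 8 is not a multiple of 3, so octal grouping straddles the byte boundary. A small sketch, using an arbitrary example byte:

```python
# A byte divides evenly into two 4-bit hex digits, but not into
# 3-bit octal groups: three octal digits cover 9 bits, not 8.
value = 0xC1                        # example byte (EBCDIC 'A')

print(f"hex:   {value:02X}")        # C1  -> one hex digit per nibble
print(f"octal: {value:03o}")        # 301 -> octal digits span 9 bits
assert value == (0xC << 4) | 0x1    # high nibble, low nibble
```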
 
Of course, it was still awkward, in that we all had to learn to use A
through F for the six four-bit groupings beyond the one expressed as 9. 
 
Code points in Unicode's 16-bit Basic Multilingual Plane are, of course,
expressible as strings of four hexadecimal digits.
 
Bill Potts 
(whose first experience with a computer was on the Burroughs E101 Desk Size
Engineering Computer, with its 256 10-digit decimal words on a drum,
plugboard programming, and a contemporary accounting-machine numerals-only
print mechanism).
 
 
 
