OK, more rambling....

I forgot one other big thing the 360 designers got right: the byte (8-bit
characters).  I suppose they could have created a 7-bit architecture if it
was an ASCII-only hardware design.  There was a lot of pressure to reduce
the bits to cut costs.  The 8-bit byte architecture was the correct
decision.  It was (is) flexible and easy to extend.

There are a few vestiges of 7-bit characters in other computer systems due
to ASCII heritage effects, and it's not pretty.  Things like uuencoding,
xxencoding, and Kermit's 7-bit file transfer are all kludges to try to work
around 7-bit architectural restrictions.  They weren't fun.  I even wrote
such a workaround utility (called REXXShip), which translated a binary file
into a 64-character printable space and wrapped the whole thing in a
micro-sized self-executing REXX parser for easy end-user decoding, at least
under REXX-equipped OSes.  I think the inner content was
xxdecode-compatible, so if you didn't have REXX you could use the old
fashioned and more inconvenient method.
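If you've never looked inside one of these encoders, the core trick is simple: pick 64 "safe" printable characters, so each one carries exactly 6 bits and survives a 7-bit link.  Here's a rough sketch in modern Python, using the xxencode character table -- illustrative only, without the length prefixes, line wrapping, and header/trailer lines the real utilities added:

```python
# The xxencode table: 64 printable characters safe on 7-bit links.
XXTABLE = "+-0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

def xx_encode_group(three_bytes: bytes) -> str:
    """Expand up to 3 binary bytes (24 bits) into 4 printable chars."""
    b = three_bytes.ljust(3, b"\0")            # zero-pad the final group
    n = (b[0] << 16) | (b[1] << 8) | b[2]      # pack into one 24-bit value
    # Peel off four 6-bit slices, high bits first:
    return "".join(XXTABLE[(n >> shift) & 0x3F] for shift in (18, 12, 6, 0))

def xx_encode(data: bytes) -> str:
    """Encode a whole byte string, 3 bytes at a time (a 4/3 size expansion)."""
    return "".join(xx_encode_group(data[i:i + 3])
                   for i in range(0, len(data), 3))
```

Three zero bytes come out as "++++", and every 3 input bytes become 4 output characters -- the same 33% size penalty uuencode, xxencode, and base64 all pay.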

An awful lot of modems and serial connections had to handle 7-bit, too,
complicating the user experience for dial-up access to host systems, BBSes,
etc.  Basically if you set your modem to 7 bits, you struggled to transfer
binary files (see: Kermit), and PC extensions for things like line drawing
characters looked like a jumbled mess.  If you set your modem to 8 bits you
usually lost the parity bit, so you lost what little error checking you
had.  And a lot of systems still tried to use that high order bit for
parity, so you saw a jumbled mess on your PC again.  Operators of modem
dial-up pools installed workarounds to try to detect what the end user had
set, but this was a mess, too.  On some systems you wouldn't see anything,
so you didn't know what to do.  (The correct answer: hit Enter a few times,
or maybe Escape, or....)  I'm sure AT&T enjoyed some extra earnings as
dial-up modem users had to call over and over again, hoping to get the
configuration settings right through trial and error, all because of the
complications of 7 versus 8 bits.  This affected all sorts of serial
connections, including hardwired ones: plotters, ASCII terminals, etc.
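To make the 7E1 problem concrete, here's a little sketch (hypothetical values, in Python) of why a parity link and 8-bit data couldn't coexist: even parity repurposes bit 7 of every byte as a check bit, so any extended character (code 128 and up) loses its top bit in transit:

```python
def even_parity_frame(ch: int) -> int:
    """Frame a character for a 7E1 link: keep the low 7 data bits and
    set bit 7 so the total count of 1 bits comes out even."""
    data = ch & 0x7F                           # only 7 bits of data survive
    parity = bin(data).count("1") & 1          # 1 if the 1-bit count is odd
    return data | (parity << 7)

# An IBM PC line-drawing character (code page 437 box corner, 0xDA)...
sent = even_parity_frame(0xDA)
# ...arrives with its high bit repurposed.  The receiver strips parity
# and sees 0x5A -- the letter 'Z' -- hence the jumbled mess on screen.
received = sent & 0x7F
```

The line-drawing box you sent turns into a scatter of ordinary letters and punctuation on the far end, which is exactly what those garbled BBS screens looked like.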

Even parallel printer connections sometimes suffered from 7/8-bit issues.
Occasionally you'd buy a cable that worked great with 7-bit ASCII printers,
such as a daisywheel printer, only to be frustrated if you tried to move
that cable to an 8-bit device like a graphical dot matrix printer.  (The
manufacturer could save a little money wiring for 7 bits, so some of them
cut that corner -- or never tested the 8th bit on their 7-bit test rigs.
They got away with it for a while.)  This was even more fun if the physical
parallel port in the machine didn't actually support the 8th bit.  Products
like LapLink (which was itself a noble kludge) had to work around 7-bit
parallel ports and cables to push 8-bit binary files through, more slowly.

There was also the interesting fact that UNIX (and UNIX-like operating
systems) and PCs (including Apples) had (and still have) very different
ideas about the meaning of carriage return (CR) and line feed (LF) at the
end of each line of text.  UNIX probably got it wrong, for the sake of
economy I guess (LF only).  PCs use CR+LF.  So even with only 128 character
slots, OS designers reading the same ASCII specification still disagreed
about what the codes actually meant for their implementations.
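The difference is only one byte per line, but it's a byte you have to rewrite every time a text file crosses camps -- roughly like this:

```python
# The same two-line text file, in each camp's convention:
unix_text = "HELLO\nWORLD\n"        # UNIX: a bare LF (0x0A) ends a line
pc_text   = "HELLO\r\nWORLD\r\n"    # PC: a CR (0x0D) + LF pair

# Moving a file between worlds means rewriting every line ending:
converted = pc_text.replace("\r\n", "\n")
```

Get the conversion wrong (or skip it) and you see the familiar symptoms: stairstepped output on one side, ^M at the end of every line on the other.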
And I'm just scratching the surface here.  It was so bad that the famous
Hayes modem didn't even use decimal 16 (data link escape) or any variation
thereof as the signal from the computer to the modem to go into command
mode, which would have been a defensible read of how to use ASCII in such
situations.  Instead, they used +++ surrounded by guard delays (no
character transmission) of about a second as the escape sequence.  Then
they patented +++ with the delay and were successful in preventing other
manufacturers from using +++/delay unless they paid royalties.  Whereupon
many manufacturers decided to implement +++ without the delay, the best
they could do for compatibility, and so there were good times for all
(except Hayes owners) as the mere transmission of any +++ sequence,
including any non-trivial form of ASCII artwork, resulted (usually) in
hanging up the phone as the modem got stuck in command mode.  So then the
terminal emulation software vendors had to work around that, usually by
inserting a software delay between the second and third plus sign.
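The guard-time logic -- the part the patent actually covered -- amounts to something like this sketch (timing values and the event representation here are my illustrative assumptions, not the exact Hayes spec, which also involved configurable S-registers):

```python
GUARD = 1.0  # seconds of required silence on each side of "+++"

def is_escape(events, now):
    """Decide whether the last three characters form a command-mode escape.

    events: chronological list of (timestamp, char) for recent traffic.
    now:    current time; used to check the trailing silence.
    """
    if len(events) < 3:
        return False
    (t1, c1), (t2, c2), (t3, c3) = events[-3:]
    plusses = (c1, c2, c3) == ("+", "+", "+")
    # Guard time before: nothing else sent within GUARD of the first '+'.
    quiet_before = len(events) == 3 or t1 - events[-4][0] >= GUARD
    # Guard time after: GUARD seconds of silence since the last '+'.
    quiet_after = now - t3 >= GUARD
    return plusses and quiet_before and quiet_after
```

A clone that skipped the two quiet_* checks would drop into command mode on any +++ in the data stream -- say, in the middle of someone's ASCII art -- which is exactly the failure mode described above.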

In fact, if you look at any of these historic interchange codes you see
vestigial parts that don't have much use any more.  That includes almost
everything in ASCII below about decimal 32.

Anyway, there were some things to like about ASCII (and EBCDIC) and some
things not to like.  Life was not all wonderful in ASCII land.

- - - - -
Timothy Sipples
IBM Consulting Enterprise Software Architect
Specializing in Software Architectures Related to System z
Based in Tokyo, Serving IBM Japan and IBM Asia-Pacific
E-Mail: [EMAIL PROTECTED]
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html