I didn't see my posting below come through the list server, but I didn't get 
a bounced mail message either, so I'm resending this. I apologize if this is a 
duplicate.

Ezra

================================

I went to the Simpsons movie web site and was surprised to see two flavors of 
English listed for the language selection: English and "English (outside of 
North America)". The second selection showed the same intro scene as the first, 
but with the distance road sign in the foreground using "km" instead of "mi".

The only downside is that Canadians are pointed to the wrong site, since all of 
their road signs are in metric. I tried to find a place to send a comment on 
the web site, but there was nothing. All I could find was a snail mail address, 
and only for FOX TV shows on the FOX web site.

Perhaps they should have listed the choices as "US English" and "English 
(outside the USA)". That way they could also use British spelling on the 
second site, which conforms better than US spelling to most local customs 
(even though I recognize (recognise?) that current British spelling isn't used 
100% in all English-speaking countries outside the USA).


--- Begin Message ---
Title: Re: [USMA:38926] Re: Discussion on the metric system
Hi Bill et al:
    Sounds like you and I came from the same era (circa 1958) of punched cards.  I was on the US federal advisory committee for standardizing on the eight-bit ASCII code.  We selected the eight-bit ASCII code as the base even though IBM wanted a BCD-based system.
    At the time, the whole world used the five-bit baudot code  in communications and Digital Equipment Corporation computers used an extension of it (ASCII) internal to their computers.  It meant that the conversion would be less stressful, less complex and more compatible by expanding the five-bit baudot code to the eight-bit ASCII code for various reasons Including the accommodation  of international and special characters for both communications and computers.  Eight bits became a byte in computers now used today while six bits were used to represent characters in early machines of IBM etc.
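A quick sketch (in Python, purely illustrative arithmetic) of how many distinct characters each of those code widths can represent; note that five-bit Baudot stretched its 32 combinations further by using letters/figures shift codes:

```python
# Distinct code points available at each historical code width.
for name, bits in [("Baudot", 5), ("six-bit BCD", 6), ("seven-bit ASCII", 7), ("eight-bit byte", 8)]:
    print(f"{name}: {bits} bits -> {2 ** bits} possible characters")
```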
Regards,  Stan Doore
 
 
----- Original Message -----
From: Bill Potts
Sent: Saturday, July 21, 2007 4:20 PM
Subject: [USMA:39118] Re: Discussion on the metric system (off topic -- of course)

Stan:
 
Please excuse the delayed response. I only visit this list occasionally these days.
 
Although this business of codes is obviously somewhat off-topic, it's interesting, especially to those of us concerned with the niceties of the metric system and therefore of the view (probably, but not necessarily) that there's no such thing as an uninteresting number (or, apparently, code).
 
The interesting thing about EBCDIC is that, as with the old 6-bit BCD, there's a direct correspondence between the encoding of any given character and its representation, as punch holes, on the now-obsolete punch cards. Every one of the 256 values has a corresponding set of punch holes. And, of course, as the punch card came first, EBCDIC code points are based on that, rather than the other way around.
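That card heritage is still visible from Python, whose standard codecs happen to include cp037 (one common US EBCDIC code page; using it as representative of "EBCDIC" here is my assumption). The zone punches (12, 11, 0) show up as the high nibble (C, D, E) and the digit punch as the low nibble:

```python
# cp037 is a US EBCDIC code page shipped in Python's standard library.
for ch in "AJS0":
    byte = ch.encode("cp037")[0]
    print(f"{ch!r} -> EBCDIC 0x{byte:02X} "
          f"(zone nibble {byte >> 4:X}, digit nibble {byte & 0xF:X})")
```

For example, 'A' encodes as 0xC1: zone nibble C corresponds to the 12-punch, digit nibble 1 to the 1-punch.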
 
Used to the maximum, the 12 rows of a punch card column could, of course, accommodate 4096 unique values. IBM's "scientific" 7000 series computers used row binary to take advantage of that, with the first 72 columns of one card being able to store the contents of twenty-four 36-bit words.
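The arithmetic behind those figures checks out; a quick sketch in Python:

```python
rows, columns_used = 12, 72

# A fully punched column of 12 rows can encode 2**12 distinct values.
assert 2 ** rows == 4096

# In row binary, 72 columns x 12 rows = 864 bits = twenty-four 36-bit words.
bits_on_card = rows * columns_used
print(bits_on_card, "bits =", bits_on_card // 36, "x 36-bit words")
```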
 
However, although looking back is fun, I'm glad technology has moved on. I've never missed those days of humping ten-thousand-card cartons of punch cards around the computer room (or the card jams or the dropped cards).
 
Bill

From: G Stanley Doore [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 20, 2007 10:20
To: [EMAIL PROTECTED]; U.S. Metric Association
Subject: Re: [USMA:38932] Re: Discussion on the metric system

Thanks Bill for the correction and further explanation.
EBCDIC was invented to use the full 8 bits for expanded representations.
Stan Doore
 
----- Original Message -----
From: Bill Potts
Sent: Tuesday, June 19, 2007 4:33 PM
Subject: [USMA:38932] Re: Discussion on the metric system

Stan Doore wrote: "IBM invented the hexadecimal to provide for all types of international characters and many special symbols."
 
Not quite. For that purpose, they invented and introduced EBCDIC (Extended Binary Coded Decimal Interchange Code), for which the unit was/is the byte, defined as a group of 8 bits. Because the three-bit grouping of the octal notation was potentially awkward, they introduced four-bit [half byte] hexadecimal notation, which already existed conceptually, but had no practical application in the days of computers with 36-bit word sizes (e.g., the IBM 7090). Any EBCDIC value was thus expressible as 2 hexadecimal digits (as was, eventually, any 8-bit ISO 646 [ASCII in the US] value).
 
Of course, it was still awkward, in that we all had to learn to use A through F for the six four-bit groupings beyond the one expressed as 9.
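The awkwardness Bill describes is easy to see in Python: an 8-bit byte splits evenly into two 4-bit hexadecimal digits, while 3-bit octal groups don't divide 8 bits cleanly:

```python
value = 0xC1  # an arbitrary byte value

print(f"hex: {value:02X}")  # two hex digits cover exactly 8 bits
print(f"oct: {value:03o}")  # three octal digits cover 9 bits -- one bit to spare

# Every byte value, 0x00 through 0xFF, fits in exactly two hex digits.
assert len(f"{value:02X}") == 2
```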
 
Code points in today's 16-bit Unicode are, of course, expressible as strings of four hexadecimal digits.
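In Python, such a code point indeed formats as four hex digits (strictly speaking, this covers Unicode's Basic Multilingual Plane; code points in the supplementary planes need five or six digits):

```python
# Code points in the Basic Multilingual Plane fit in four hex digits.
for ch in "A€":
    cp = ord(ch)
    print(f"{ch!r} -> U+{cp:04X}")
```

For example, the euro sign prints as U+20AC.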
 
Bill Potts
(whose first experience with a computer was on the Burroughs E101 Desk Size Engineering Computer, with its 256 10-digit decimal words on a drum, plugboard programming, and a contemporary accounting-machine numerals-only print mechanism).
 
 
 

--- End Message ---
