On Sun, 6 Jan 2002 13:09:45   
 Joseph B. Reid wrote:
...
>I suspect that Marcus confuses digit length and word length.  A decimal
>digit requires 4 binary digits or bits.  Word length can be anything from 8
>bits in the IBM 1401 and 1410 up to 64 bits in the last Control Data
>computer.  The Control Data 6600 had a 60-bit word.
>...
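As a side note, the "4 bits per decimal digit" claim above is just binary-coded decimal (BCD) packing. A minimal sketch in Python (the function names here are illustrative, not from any particular library):

```python
# Sketch of binary-coded decimal (BCD): each decimal digit 0-9 fits
# in 4 bits (a nibble), so two digits pack into one 8-bit byte.

def to_bcd(n: int) -> bytes:
    digits = [int(d) for d in str(n)]
    if len(digits) % 2:              # pad to an even number of digits
        digits.insert(0, 0)
    # Pair digits up: high nibble gets the first digit, low nibble the second.
    return bytes((hi << 4) | lo for hi, lo in zip(digits[::2], digits[1::2]))

def from_bcd(b: bytes) -> int:
    # Read each nibble back out as one decimal digit.
    return int("".join(f"{byte >> 4}{byte & 0xF}" for byte in b))

print(to_bcd(1959).hex())        # '1959' -- each hex nibble is one decimal digit
print(from_bcd(to_bcd(1959)))    # 1959
```

Note that the hex dump of the packed bytes reads the same as the decimal number, which is exactly the point of BCD.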
As a point of clarification: I avoided programmer's jargon because I don't know how many people here are familiar with computing terminology.  But yes, what I meant relates to the concept of word length, not to defining what a byte is.  That has already been settled at 8 bits, and I couldn't care less about that aspect; it's something we all learn in the first year of any computer-related degree program.  In practice you won't even see us professionals dwell on such things anymore.  What matters in the end is the number of bits, and how the hardware is built around that amount.

Marcus

