Andrew Lentvorski wrote:
>.. Bob La Quey wrote:
>> Hmm ... kind of fun.
>>
>> But it also leads toward an intriguing idea.
>>
>> Metric is good, but really should be based on division by
>> two not ten. Division by two is much more natural than
>> division by ten.
> 
> Sorry.  I disagree.  I much prefer working in units that map to my
> native number system..

..I have addressed this issue in the past. Can't find my original, but
Google got it back for me. Here's a commentary from circa 1998.

- - -

I respectfully submit that some previous message contributors suffer
from cultural parochialism. I don't want to single out any one person,
since I believe that the affliction may be quite widespread, but a
sample quote runs something like:

>...Only computers think in binary. I have 10 fingers and 10 toes so
>I think in decimal numbers. ...

If one can only suspend the conventional model implanted by years of
institutional mathematics training and objectively re-examine human
architecture, it must be acknowledged that the canonical accumulating
device for manual calculation is best represented by a hexadecimal
(base-16) number system.

Clearly, each hand has 4 binary digits ("bits") and, of course, a carry
flag (sometimes called half-carry, or even "thumb"). Other names for the
hand, used in various technical disciplines, are "half-adder" and
"nibble".

With a standard complement of two hands, you can count to FF (255,
decimal). For reasons unknown to me, the 2-handed unit has come to be
called a "byte".
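
For the skeptics, here is a minimal sketch in Python. The finger-to-bit
assignment is my own convention, not gospel: index finger as bit 0,
right hand as the low nibble.

    FINGERS = ("index", "middle", "ring", "pinky")  # bit 0 .. bit 3

    def hand_value(raised):
        # One hand is a nibble: 4 fingers -> 4 bits, so 0..15 (0..F).
        return sum(1 << FINGERS.index(f) for f in raised)

    def byte_value(right_hand, left_hand):
        # Two hands form a byte: right hand is the low nibble, left
        # hand the high nibble, giving the full range 0..255 (0..FF).
        return hand_value(right_hand) | (hand_value(left_hand) << 4)

    # Both hands fully extended yields the advertised maximum.
    assert byte_value(FINGERS, FINGERS) == 0xFF  # 255, decimal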

By pressing other appendages into service, one may further extend the
counting limit to FFFF (65535, decimal). There seems to be some
disagreement on what to call the storage unit formed by combining hands
and feet. One sees word, short, and even, in times past, int. Would
anybody care to vote for "fourhand"?
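
If one buys the sketch above, the extension writes itself (the ordering
of appendages is, again, my own assumption):

    def fourhand_value(nibbles):
        # Combine 4-bit appendages, least significant first. Two hands
        # plus two feet give 4 nibbles = 16 bits, for 0..FFFF.
        value = 0
        for shift, nibble in enumerate(nibbles):
            value |= (nibble & 0xF) << (4 * shift)
        return value

    # right hand, left hand, right foot, left foot -- fully extended
    assert fourhand_value([0xF, 0xF, 0xF, 0xF]) == 0xFFFF  # 65535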

Arguably, one could insist that everything can ALWAYS be broken down
into fingers (or "bits"). In fact, I AGREE that bit IS more fundamental
than hand (which explains why computers, which can only "think" in the
simplest sense of the term, make heavy use of the binary number system).
However, it is my observation that humans prefer a higher level of
abstraction.

Incidentally, there is no truth to the ugly notion (advanced by
kernel-hackers, no doubt) that most humans use names such as "thirteen"
instead of "1101" simply because they are slow-witted. I know that I
have no trouble equating "index-finger + ring-finger + pinky" with
"thirteen" (or hex D), and the shorter representation certainly greatly
aids human communication and conceptualization.
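
Indeed, under the bit assignment sketched above, the claim checks out:

    # index = bit 0, ring = bit 2, pinky = bit 3:  1 + 4 + 8 = 13
    assert hand_value(("index", "ring", "pinky")) == 0b1101 == 0xD  # 13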

Despite the convenient practice of using shorthand names for various
handful-multiples -- namely, 2-hand units, 4-hand units, and even 8- or
16-hand units (8, 16, 32, 64 bits, respectively) -- most would agree
that the hexadecimal "hand" is the natural chunk-size for human
reasoning and communication. I trust that these clarifying observations
will reduce the message clutter rooted in the cultural attachment to the
artificial and messy decimal system!

- - -

P.S.    Does anyone know for sure whether the story about the 3-fingered
electronics tech who built the PDP-7 prototype is true? I mean, is that
really why octal snuck into the original Bell Labs software?

Regards to all...jim

