On Wed, 19 Sep 2012 14:25:11 +0000, Bill Fairchild wrote:
>
>It could be a lot worse. Hardware engineers number the bits in a byte in the
>opposite order from the way we software techies do; i.e., bit 0 (hardware) = bit 7
>(software), etc. I think hardware people must consider bit 0, the rightmost
>bit in their world view, to represent two to the zero-th power, so I
>understand why they number the bits from right to left. There are also many
>languages that are written, and thus must be read, from right to left, and
>some ancient languages were even written both ways on the same stone
>document, using the boustrophedon method described once by John Gilmore.
>
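The bit-numbering difference above can be sketched in a few lines of Python. This is a minimal illustration, not anyone's official API; the function names are mine, and "IBM numbering" here means the z/Architecture convention of bit 0 as the most significant bit:

```python
def bit_lsb0(byte, k):
    """Bit k with bit 0 = least significant (the 2**0 'hardware' numbering)."""
    return (byte >> k) & 1

def bit_msb0(byte, k):
    """Bit k with bit 0 = most significant, as in IBM mainframe manuals."""
    return (byte >> (7 - k)) & 1

b = 0b10000001
assert bit_lsb0(b, 0) == 1          # rightmost bit
assert bit_msb0(b, 0) == 1          # leftmost bit
assert bit_msb0(b, 7) == bit_lsb0(b, 0)  # the two numberings mirror each other
```

The same physical bit is "bit 0" in one scheme and "bit 7" in the other, which is exactly the confusion being described.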
"Left" and "right" may not be very meaningful here. A colleague once
asked me,
"Does this computer store bits left-to-right or right-to-left?"
"Point of view. If I look at the indicator lights on the front panel,
it appears to be left-to-right; if I walk around and open the back
panel, it appears right-to-left. If I stand it on its side as a tower..."
"You know what I mean!"
Actually, I didn't. Did he mean how they appear in the engineering
drawings? How they're conventionally numbered? Something else?
I once looked at a VAX (little-endian) dump. The ASCII and hex
appeared side by side, in opposite reading directions.
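That dump effect is easy to reproduce. A sketch in Python, assuming a 32-bit word holding the ASCII bytes "ABCD" stored little-endian as on a VAX:

```python
import struct

data = b"ABCD"                      # ASCII reads left-to-right: A B C D
(word,) = struct.unpack("<I", data)  # interpret those bytes as a little-endian uint32
print(hex(word))                     # 0x44434241 -- hex bytes read D C B A from the left
```

The string column reads A-B-C-D while the hex column of the same word reads 44-43-42-41, i.e. D-C-B-A: the two representations run in opposite directions.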
How are Arabic numerals written in a paragraph of Arabic (or Hebrew)
text? (This may depend on the behavior of specific word processors.)
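For what it's worth, the Unicode Bidirectional Algorithm (UAX #9) answers part of this independently of any word processor: digits carry their own directional class and are rendered left-to-right even inside a right-to-left run. A quick check with Python's standard `unicodedata` module:

```python
import unicodedata

# Bidirectional classes per UAX #9:
print(unicodedata.bidirectional("7"))       # 'EN' -- European number
print(unicodedata.bidirectional("\u0661"))  # Arabic-Indic digit one -> 'AN'
print(unicodedata.bidirectional("\u05D0"))  # Hebrew alef -> 'R'
print(unicodedata.bidirectional("\u0627"))  # Arabic alef -> 'AL'
```

So the letters run right-to-left, but the number inside them is laid out most-significant-digit first, left-to-right.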
-- gil
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN