Ah ... the old NUXI problem. :)

> I also ran into "endian" while documenting a UNIX application. A
> subject-matter expert (SME) patiently explained it to me, though my
> impression at the time was that he was a little uncertain. Perhaps the
> uncertainty was about the explanation itself, not about his grasp of
> the concept.

> His explanation was simply that some processor chips interpret binary
> code (ones and zeroes) from the "Big end" and others "Little end." So
> a binary number that appears like 1000 to one chip would appear as
> 0001 to the other. I forget whether the big end is on the left or the
> right.

Uh ... if you are showing a binary number above, this is not quite
right. The bits *within* a byte do not change their order between the two
common ENDIAN types. What changes is the *order of the bytes* in memory
as the processor accesses them.

And, depending on whether you access the memory in 16-bit words or
32-bit words (and presumably nowadays in 64-bit words), the outcome
is different.

Two final comments:

1. In my documents, I call it "byte order" rather than endianness. Much
simpler and cleaner. KISS applies!

2. The term comes from Gulliver's Travels in Lilliput. The two countries
could not agree on which "end" of the egg to crack first - the big or
the little. Hence two camps ... same as in the computing world! :)
